Turing Award Winner On Future Of Tech
MELISSA BLOCK, host:
This is ALL THINGS CONSIDERED from NPR News. I'm Melissa Block.
ROBERT SIEGEL, host:
I'm Robert Siegel.
And it's time now for All Tech Considered.
(Soundbite of music)
SIEGEL: The Association for Computing Machinery recently awarded its A.M. Turing Award to the computer pioneer Charles Thacker. Thacker helped develop the Alto, a 1973 precursor to the personal computer. The Turing Award is often called the Nobel of computer science. Today, Charles Thacker does research for Microsoft, and he's also been working on, among other things, their tablet computer. And he joins us today. Welcome to the program.
Mr. CHARLES THACKER (Microsoft Researcher, A.M. Turing Award Winner): Thank you.
SIEGEL: The Alto, tell us a little about that machine you designed that led to the personal computer.
Mr. THACKER: The Alto was actually the first computer that had a bitmap display and a mouse and fit under a desk. So it actually was a personal machine. It was, I believe, almost the first time that one computer, one person was the rule.
SIEGEL: If we were to look for something today that's still a matter of research and that stands a good chance of becoming as universal as the ideas of the Alto did, what is it? What's 10 years, eight years away from realization on the market right now?
Mr. THACKER: Right now the industry is actually at a point of inflection because it has proven to be very difficult to make computers go any faster than they do today, that is, run programs faster. So, instead of seeing faster computers, which we've seen for 30 years, we're just simply going to see more of them. The problem of that is getting those computers to work well together to solve a single problem. This is the problem of parallel programming. And there's a very intense amount of work being done today on exactly how to solve that problem.
SIEGEL: What would be an application of parallel programming?
Mr. THACKER: Well, there are a lot of things that people don't do that way today that they're going to need to. Examples might be things like speech and handwriting recognition. The computer today is actually remarkably inefficient at understanding what the user is doing or wants to do. And being able to do better interaction would be one example of something that might yield effectively to parallelism.
SIEGEL: You've been working on tablet computers for quite a while, since the 1990s, I gather.
Mr. THACKER: That's correct.
SIEGEL: How has the idea evolved over that time?
Mr. THACKER: Not a whole lot. There have been some changes in the way the user interface works. But by and large, it's a matter of trying with the tablet to mimic the characteristics of paper and pencil or paper and brush as closely as possible. And so, that's what people have been trying to do. And we're getting closer and closer to being able to do that, but we're not there yet.
SIEGEL: When the Palm Pilot hit the markets, people had to learn a way of writing. In other words, yes, it can recognize your writing after you learn how to write the way you have to write on a Palm Pilot. Not the most user-friendly idea.
Mr. THACKER: Not user friendly, but incredibly effective. I mean, people were willing to change their behavior because the benefits that accrued from doing so were so great.
SIEGEL: That must be a common question in computer science, which is, how much will people accommodate us to have this incredible machine that can do this for them and to what extent do we have to meet them in terms of the way that they live and the way they do things without this machine?
Mr. THACKER: For a long time, people have been accommodating to the computer and in many cases they would prefer not to. So, it behooves us as developers and engineers to try to make the computer easier to use.
SIEGEL: Can you imagine the perfect computer - what it would do?
Mr. THACKER: My wife, actually, has something to say about this and I'll share it with you. She said, the best computer is the invisible computer. She does not use computers. I'm her secretary. I take her email from her friends and so on.
(Soundbite of laughter)
SIEGEL: I see.
Mr. THACKER: Which is a little surprising because we've had computers in our house as long as there have been computers. And I would like the perfect computer to be unobtrusive and easy to operate. For instance, I would prefer not to have to stare at a screen. I'd prefer a computer that actually projected an image on my retina so that I could both be looking at my environment and also have computer-generated things in that environment. People are beginning to call this augmented reality. It should talk to me. I shouldn't need to read text from it. It should cue me in when something interesting happens.
SIEGEL: Now, my understanding, from what I've read in just a couple of articles about innovation in computers, because of the phenomenon of the capacity of semiconductors increasing so rapidly, I gather the problem that you folks would face is that if you invented for the semiconductors that we know, by the time your product came to market it would be obsolete already. So you'd have to invent for something that hasn't quite happened yet but you can predict will be there.
Mr. THACKER: That's correct. And we have always been able to predict. Even at the time of the Alto, Gordon Moore had stated his famous law about the rate at which semiconductors would progress, and we understood it fairly well. So we were, in a sense, designing for the future.
SIEGEL: But does that mean that in developing computers, ultimately, you're always building something that, if you get it right, shouldn't quite work yet at the point that you've made it?
Mr. THACKER: It's not so much that it shouldn't quite work yet, it's that it will be more expensive than it will ultimately be. We see this, for instance, in the video game business. The first instance of a new video game console is extremely expensive because it uses cutting edge technology. But eventually Moore's Law improves the underlying technology and the console manufacturer uses Moore's Law not to provide more speed, because gamers want the game to play at the same rate, but to lower the cost. So eventually, throughout the five-year life cycle of a given console, you'll lose money at the beginning, but you'll make money at the end.
SIEGEL: You've been at this for quite a while, and there used to be a lot of lore that people would come in, master a particular technology and a decade later, everything they knew was obsolescent. What is the life for veteran computer scientists and for people several decades after their first huge breakthrough?
Mr. THACKER: Well, one of the nice things about computing is that it's a relatively new field. A lot has been learned in 50 years. And it is possible to go back and look at some of those early papers and still find gold in those hills. We have to continually reassess ideas in light of the new technologies that we now have. I've been advocating recently that we look back at some of the things that were developed, like, 40 years ago and say, were these things really a bad idea? Should we have gone down the path that we went down? And I think the answers might be surprising.
SIEGEL: Well, Charles Thacker, thank you very much for talking with us about all these things.
Mr. THACKER: My pleasure.
SIEGEL: Charles Thacker, winner of this year's A.M. Turing Award, also known as the Nobel Prize of computer science.
NPR transcripts are created on a rush deadline by a contractor for NPR, and accuracy and availability may vary. This text may not be in its final form and may be updated or revised in the future. Please be aware that the authoritative record of NPR’s programming is the audio.