Cracking Open Encryption Standards

Recent revelations about the extent of NSA surveillance have put even the standards by which encryption systems are designed into question. Encryption experts Matthew Green, Phillip Zimmerman, and Martin Hellman discuss what makes a code secure and the limits of privacy in the modern age.

Copyright © 2013 NPR. For personal, noncommercial use only. See Terms of Use. For other uses, prior permission required.

IRA FLATOW, HOST:

This is SCIENCE FRIDAY. I'm Ira Flatow. One of the selling points of storing your data on the cloud, or in an encrypted file you created with your high-tech software, was the idea that no one else could break in, even with the world's most advanced supercomputers. But with the revelation that the NSA can see inside some of the codes on the data you were assured was uncrackable, what does it all mean now?

Joining me now to talk about it are some of the people who helped build the Internet security system. If you'd like to talk about this our number is 1-800-989-8255. And you can Tweet us at scifri. Matthew Green is an assistant research professor in the department of computer science at Johns Hopkins University in Baltimore. He specializes in cryptography. Welcome to SCIENCE FRIDAY.

MATT GREEN: Hi, it's nice to be here.

FLATOW: Thank you. Phil Zimmerman is the creator of PGP. You remember that? That's the encryption system involved in many communication systems, which used to be pretty good protection. He's also president and co-founder of Silent Circle, a company developing secure messaging and voice communications that's based in National Harbor, Maryland. Welcome back, Phil, to the program.

PHIL ZIMMERMAN: Thanks, Ira. I'm glad you remember. I was here in the late '90s.

FLATOW: Oh, when it comes to that stuff I never forget. Martin E. Hellman is a professor emeritus of electrical engineering at Stanford. He's perhaps best known for his invention of public key cryptography in cooperation with Whitfield Diffie and Ralph Merkle in the mid 1970s. Welcome to SCIENCE FRIDAY.

MARTIN HELLMAN: Thank you, Ira.

FLATOW: Let me - let's do some preliminary spade work here and talk. Matt, how was the NSA able to break into the supposedly secure files that we all thought were secure?

GREEN: Well, we still don't completely know the answer to that. What we know now is a lot more than we knew a month ago. What we thought a month ago was that the algorithms in the systems we had were very secure, secure enough that that was the part of the system we didn't have to worry about.

Unfortunately, what we've recently learned, thanks to a series of leaks, is that the NSA has been doing a number of things. They've actually been developing new attacks but they've also been going out and weakening some of the encryption standards and products that we use to make those files secure. And that seems to be one of the basic ways that they have to get by this encryption.

FLATOW: You mean they helped create the standards, and so they knew how to break them?

GREEN: Yes. In fact, the NSA's role, by law, is to both secure American information systems and also to attack foreign information systems. The problem here is that it's very hard to separate those two roles. So whenever they've had an opportunity to promote standards, it appears that they may have taken some of that opportunity to actually weaken the standards that we use here in America.

FLATOW: Phil Zimmerman, does that mean that there's nothing that can be encrypted now that can't be broken?

ZIMMERMAN: No, it doesn't mean that. There's a lot of encryption products out there and some of them are more carefully designed than others. I feel a little self-conscious in saying this but I'd like to point out that conspicuously absent from any of the Snowden documents is any mention of any of my products or...

(LAUGHTER)

ZIMMERMAN: ...so I'm feeling pretty good about that.

FLATOW: Do you think they haven't broken them, is that what you're saying?

ZIMMERMAN: I think they haven't because they still use them for classified - protecting classified data.

FLATOW: So what did you do in your product that made it unbreakable?

ZIMMERMAN: Well, you know, I'm kind of a fanatic about these things, and my OCD personality kind of helps.

(LAUGHTER)

FLATOW: But there was also some mention in the reading - the research on this - that the companies were forced to cooperate with the federal government in allowing their stuff to be broken into, or in handing over the records they were looking for.

ZIMMERMAN: In some cases they were forced, if there was a particular case, you know, like if there was a court order. But in other cases I think it was a matter of the NSA just trying to get companies to cooperate with them in a friendly way, or maybe some way they were incentivized.

FLATOW: Martin Hellman, you and your colleagues developed the idea of a public key encryption years ago. Give us a brief sketch of how that works.

HELLMAN: Yeah, well, the other thing that we were involved in was fighting the weakening of the data encryption standard back when it was promulgated in 1975. One other thing I just want to add is there are many ways to get at data that's been encrypted without actually breaking the encryption system. There have been reports that Comodo, one of the certificate authorities, may have been broken into by NSA. We used to think it was Iran that had done it.

And so you can work around encryption very often. I mean, if you get a key logger put on someone's machine, for example, no encryption will get around that because you can see what they're typing in before it's encrypted. Do you still want me to go back to...

FLATOW: No, no. Keep going with this. So this - in other words you can get into someone's computer and not have to break anything.

HELLMAN: Right. And another example - this actually happened with smart cards. There was a really good secret key built into smart cards, which were in heavy use in Europe. And a colleague of mine, Paul Kocher, figured out how you could read the secret key off just by how much power the chip was using as it goes through the secret key and does its operations. It uses less power when there's a zero in the secret key and more power when there's a one.

So as it goes through the, let's say, thousand bits of secret key, you see low power, low power, high power and you know that's 001. You don't have to actually break the system. These are tempest attacks. So, there's lots of things that can be done.
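The attack Hellman describes can be sketched as a toy simulation in Python. The power levels, noise range, and threshold here are invented for illustration; this is not a model of any real smart card, just the core idea that a per-bit difference in power draw hands the attacker the key.

```python
# Toy power-analysis simulation: a chip that draws more power when
# processing a 1-bit than a 0-bit leaks its secret key through the
# power trace. All numeric values are made up for illustration.
import random

def simulate_power_trace(secret_bits, noise=0.05):
    """One 'power sample' per key bit: 1-bits draw more power."""
    return [(1.0 if bit else 0.5) + random.uniform(-noise, noise)
            for bit in secret_bits]

def recover_key(trace, threshold=0.75):
    """Attacker's side: classify each sample against a threshold."""
    return [1 if sample > threshold else 0 for sample in trace]

key = [random.randint(0, 1) for _ in range(64)]
trace = simulate_power_trace(key)
assert recover_key(trace) == key  # key read straight off the power line
```

Real power-analysis attacks must separate much smaller signals from much larger noise (Kocher's differential power analysis averages over many traces), but the leak itself is exactly this simple.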

FLATOW: If you hire a foreign company to store your data, one that is immune from the American government, would that be a safer thing to do, so that the government can't ask to see your stuff?

HELLMAN: To paraphrase that, we don't know.

(LAUGHTER)

HELLMAN: First, we've got to worry about the foreign government and is the foreign government secure against NSA? Is it secure against the Russian equivalent? You know, it's a real problem. I think you have to be careful whenever you put something in the Cloud. And even put things on your computer. I mean...

ZIMMERMAN: A lot of the governments take advantage of the fact that you're giving your data to a third party. You're giving it to a service provider and then they approach the service provider and ask them for the data. That's probably going on in other countries too.

FLATOW: But you say, Phil, that you still think that you can - that your encryption stuff is good enough.

ZIMMERMAN: Well, I would rather not say that myself. I'd rather have others say that.

FLATOW: Are you putting a target on your head now?

HELLMAN: Ira, one other thing. It's important to recognize that there are tradeoffs - this is Martin Hellman again...

FLATOW: Yes.

HELLMAN: ...that there are tradeoffs involved here. And I mentioned, when your producer called me, that I'm not as active in this area as Phil or Matt is now. My primary concern for many years has been nuclear weapons and how we're going to survive possessing these things. And so right now everybody's up in arms over NSA's snooping, and rightly they should be. On the other hand, let there be a nuclear terrorist incident or even something like September 11th again and you'll find the country saying, who the hell was standing in the way of NSA's snooping?

So we need to recognize that there's a tradeoff here. And I also think we need to recognize that one of the reasons we're a target - the main reason we're a target as a nation for terrorist activity, and the reason NSA doing things like this has some beneficial interest - is because we have such a heavy military emphasis in our foreign policy. Basically, our civil liberties are collateral damage to our national security state.

FLATOW: Phil, would you agree with that?

ZIMMERMAN: Yeah, I think that we went a little crazy after 9/11. And if we are attacked again we might go a little crazy again. Maybe as the years have gone by we've recovered our sanity. And now - and, you know, if we get attacked again our insanity will be refreshed.

FLATOW: So we're talking about national security issues here, when national security is - it's a tradeoff to our privacy.

HELLMAN: Exactly. And underlying it all is the fact that we are basically in a 24/7, 365-days-a-year national security state, which is causing this. If we want our civil liberties back we're going to have to become a lot less militaristic in our approach to things.

ZIMMERMAN: You know, in World War II we abridged the civil liberties of a lot of people, especially Japanese Americans, and put them in internment camps. And after the war was over we let them go and realized that that was a heavy-handed thing to do and inappropriate. And we kind of promised never to do it again. But that's because the war was over.

What's happening now is that the war never ends. And so each level - each increment of the encroachment of civil liberties becomes the new normal.

GREEN: So one of the - this is Matt Green - one of the points here is that we don't know what's going to happen in the future. We don't know who's going to be reading this information that's being collected by the NSA. We don't know who's going to be using these products that the NSA has weakened. So it's very important for us to make sure that in the time we have now, we build the infrastructure for people to have privacy and prevent that information from being scooped up into databases. And that's why what's happening now is so important.

HELLMAN: I would add something to that, Matt. It's also important that we have better oversight than we now have. Like, I was unaware until I read the newspaper report that Chief Justice Roberts appoints all the FISA judges. That's a potential single point of failure.

GREEN: Yeah.

HELLMAN: Or, depending on your politics, it is a single point of failure.

GREEN: Yeah.

FLATOW: Explain that. Explain that to our listeners.

HELLMAN: Oh. There is this FISA court that has to approve warrants to go into - for domestic intelligence operations. And - but all of the judges are appointed by the Chief Justice of the Supreme Court, currently Chief Justice Roberts. And the news reports have indicated that, I think, nine of his 12 appointees have been Republican.

There's the potential for appointing a non-oversight oversight. And so we really need highly knowledgeable people, people like Matt and Phil, with high clearances who are reviewing what's being done and making sure that it's right. We shouldn't just have senators and congressmen involved in it.

FLATOW: Mm-hmm. Let me just play devil's advocate for a moment and, you know, in the height of the Cold War - on movies, on television, in the news - you would always hear when they talked about a Communist state. They would always say, well, we're doing this because of national security. You're able to - you know, we tap your phones, we listen in, we do whatever, because of national security.

Doesn't that make you think that maybe, you know, we are sort of falling into that same kind of rhetoric here?

GREEN: Yeah. Unfortunately, national security is the root password of the U.S. Constitution.

FLATOW: Mm-hmm. And it justifies any amount of loss of privacy.

GREEN: It appears. I mean, that seems to be the way things are done. I think we need to recognize that part of our national security is in the civil liberties that we have. We lose national security when we create a surveillance infrastructure so powerful, trained on our own domestic population, that some future administration could abuse it to remain in power.

We have no idea who will be in the White House in 2017. Will he have the moral sensibilities of Thomas Jefferson or Vladimir Putin? What would happen if a future government turns really bad and has the ability to know everything about everyone all the time?

FLATOW: Facial recognition everywhere.

GREEN: Yeah. Tracking of all of our movements, all of our transactions, all of our communications, every web page we look at, every little muscle twitch we make.

FLATOW: Mm-hmm. I'm Ira Flatow. This is SCIENCE FRIDAY from NPR talking with Phil Zimmerman and Matt Green and Martin Hellman. Phil, your company does secure chat and secure voice but it recently stopped secure email service. Why is that?

ZIMMERMAN: Well, email is different from secure telephony and secure instant messaging in that it involves persistent keys that have to be managed over a long term. And, you know, the work that I did on PGP was encryption software that runs on your laptop computer. But we didn't have a version of PGP that could run on a smartphone and it would take us a long time to write one.

And we'd have to write a mail application to go with it, because you can't integrate something into, say, Apple's mail application or the one on Android. So it would take a lot of engineering effort, and we didn't want to wait that long. So we put it on the server instead. But that means we'd have to store the keys on the server too, and that means they're an invitation to a national security letter or some kind of court order.

And we felt that that was a bit too vulnerable. But, you know, even if we didn't store the keys there, there would still be the metadata that would be exposed there. Even if the message bodies are encrypted, as they would with PGP, there's still the metadata, the information that says who it's from, who it's to, the subject line, the date and time stamp, the IP address of where it came from, that sort of thing.

That's all exposed. And as we've seen from our recent revelations from Snowden, the metadata is extremely important. I mean, we always knew that in the business of - in the crypto business but we're seeing how that's applied in the real world and it is indeed an extremely powerful way of learning about what everyone's doing.
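Zimmermann's point about metadata can be made concrete with a few lines of Python: even when a message body is PGP-encrypted, the standard headers travel in the clear. The addresses, subject, and ciphertext below are invented for illustration.

```python
# Parse an email whose body is encrypted: the headers - the metadata
# Zimmermann describes - are still fully readable by any observer.
# All names and the ciphertext here are made up for illustration.
from email import message_from_string

raw = """\
From: alice@example.com
To: bob@example.com
Subject: meeting notes
Date: Fri, 13 Sep 2013 14:02:00 -0400

-----BEGIN PGP MESSAGE-----
hQEMA3fakeCiphertextOnly...
-----END PGP MESSAGE-----
"""

msg = message_from_string(raw)

# The "envelope" is exposed even though the body is opaque.
metadata = {h: msg[h] for h in ("From", "To", "Subject", "Date")}
print(metadata)
```

Who talked to whom, about what, and when, over the whole population: that is the traffic-analysis picture the panel says is so powerful, and no amount of body encryption removes it.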

FLATOW: Mm-hmm. Matthew, Martin, any comment?

GREEN: Well, I think that we - the only comment I would have is that as cryptographers, as security people, we've known about these attacks for a long time. Things like the vulnerability of metadata and how email is difficult to secure. It's very interesting to see it from the other perspective, to see that there are people at the NSA who are thinking about exactly the same things as we are and exploiting them.

ZIMMERMAN: What I find so different from our theoretical considerations of what we imagined is that this is so breathtakingly comprehensive in its scope that we just never imagined it applied on such a grand scale.

FLATOW: What's that famous line? Failure of the imagination.

ZIMMERMAN: Yeah.

HELLMAN: Yeah. Martin Hellman here. Computation and storage get cheaper by about a factor of 10 every five years, so it's to the point now where you could store, on a roughly $100 hard drive, every conversation you have through your whole life. And so there's a lot of potential there. But while there are some of these new risks that we're facing, a lot of this is old.

Just recently, the George Washington University's National Security Archive, which is a great place for getting at formerly classified information using Freedom of Information Act requests, got a formerly classified NSA history. It notes that during the height of the Vietnam War protest movements in the late '60s and early '70s, NSA tapped the overseas communications of people like Martin Luther King, Muhammad Ali, New York Times journalists, and then-Washington Post humor columnist Art Buchwald and, most startlingly, they say, two prominent members of Congress, Senators Frank Church and Howard Baker.

So what's new, I think, is the potential scale of what they can store and what they can get at.

FLATOW: All right. We're going to take a break and talk lots more with Matthew Green, Phil Zimmerman, Martin Hellman. And your calls are number 1-800-989-8255. You can also tweet us at scifri. Go to our website sciencefriday.com, join the conversation there. We'll be right back after this break.

(SOUNDBITE OF MUSIC)

FLATOW: I'm Ira Flatow. This is SCIENCE FRIDAY from NPR.

(SOUNDBITE OF MUSIC)

FLATOW: This is SCIENCE FRIDAY from NPR. I'm Ira Flatow. We're talking this hour about encryption and security with my guests Matthew Green of Johns Hopkins University, Phil Zimmermann, creator of PGP and co-founder of Silent Circle, and Martin Hellman of Stanford. Our number, 1-800-989-8255. We're going to go to the phones. Before we go to the phones, though, Matt, I want to ask you what are some of the things that we know that the NSA has broken into?

GREEN: So we have heard a number of things that we can probably credit as real. One of them is 4G cell phones. Those phones use encryption to keep people from being able to intercept and listen to your phone calls. That encryption has been weakened in some way. We don't know how.

Another example: there are these things called random number generators that are used in almost every security product, including web servers. So when you go to Amazon.com and you see that little lock to protect your credit card number, there's a random number generator there. And we know that NSA, through NIST, which is the National Institute of Standards and Technology, has very likely put back doors in some of those standard algorithms that allow them to essentially break those systems entirely.

FLATOW: You mean the NSA created those back doors?

GREEN: That's exactly right. So NIST works with NSA and they're required to by law.

FLATOW: Mm-hmm.

GREEN: We thought that NSA was helping NIST build more secure standards for Americans to use. We now suspect and have strong evidence to believe that actually the situation was exactly the opposite, that NIST was being used to put out standards that the NSA could break.
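Why a compromised random number generator is fatal can be sketched in Python. This is not the NIST construction the panel alludes to; it is a deliberately weak stand-in, a key derived from a guessable seed, that shows the same consequence: whoever can predict the generator's output reproduces every "secret" key. The seed value is hypothetical.

```python
# A predictable generator means predictable keys. Here the "back door"
# is simply that the seed is guessable (e.g. a timestamp); the actual
# suspected back door worked differently but to the same effect.
import random

def generate_session_key(seed):
    rng = random.Random(seed)  # generator state fully determined by seed
    return bytes(rng.randrange(256) for _ in range(16))

# Victim seeds the generator with something guessable.
victim_key = generate_session_key(seed=1378900000)

# An attacker who knows, or can enumerate, the seed gets the same key.
attacker_key = generate_session_key(seed=1378900000)
assert attacker_key == victim_key
```

The encryption algorithm downstream can be flawless; if its keys come out of a generator the attacker can predict, the whole system is broken, which is why this particular revelation alarmed cryptographers so much.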

FLATOW: If I go to the Cloud now and I want to store stuff in one of these Cloud services that says don't worry, we're absolutely secure, is that a little bit misleading?

ZIMMERMAN: In some cases it might be. But, you know, regarding the random number generator, NIST has published several random number generators and one of them is just egregiously awful. Cryptographers recognized years ago that it was egregiously awful in its design, and no competent cryptographer would knowingly use it unless he was turned to the dark side.

GREEN: And yet...

ZIMMERMAN: And yet RSA, the security company, did use it as their default random number generator. And they do have competent cryptographers working there. So.

FLATOW: How do you explain that?

ZIMMERMAN: Well, I'm not going to - I think I'd rather not be the one to say.

(LAUGHTER)

FLATOW: But if someone else were to say it, what would they say?

ZIMMERMAN: Well, someone else might say that maybe they were incentivized.

HELLMAN: Let me - this is Martin Hellman. Let me bring up a question for the other two guys. Back in the - maybe we need a less confrontational approach to NSA. And I'll explain why historically I believe that. Back in the '70s and '80s and in the '90s when we were in confrontation with NSA over their attempts to weaken security, we didn't get good security.

They used their ability to control exports to also control the domestic market, because most companies didn't want to have separate products domestically and internationally. And so while we fought a good fight, we didn't really get what we wanted.

It was only after a National Research Council committee on which I served had a representative from NSA, a representative from law enforcement, as well as privacy advocates like myself and we got together and talked that we were able to get reasonably good crypto exportable.

And so what if the option is this: if we don't have NSA's blessing, everybody will be able to listen in, whereas with NSA's blessing, maybe only the government will be able to listen in - and maybe there will be adequate oversight, if we can negotiate that.

GREEN: No.

HELLMAN: What do you guys - is it different today?

GREEN: Well, we did get the export controls lifted at the end of the '90s and now strong crypto is the norm in the industry. We're able to export strong crypto and we're able to design our own strong crypto without any help from NSA. Most of the NIST standards are good standards. They publish quite a few documents and I've read many of them and they're really excellent.

It's only in rare cases - I mean, there was that random number generator, and I haven't found anything else. Perhaps there are some elliptic curves that they published that are not as strong as we now know how to design. But they still use them for secret and top secret government applications. So I don't think that there's any weakness in them.

ZIMMERMAN: Well, that's the really surprising part of this. And I just want to be clear. When we talk about crypto, we're not talking about hackers and criminals and members of Anonymous communicating with each other over the Internet. We're talking about you. We're talking about people visiting websites.

GREEN: Yeah.

ZIMMERMAN: We're talking about companies exchanging data with banks.

GREEN: Sure.

ZIMMERMAN: Banks exchanging data with the federal government, even sensitive data that might have national security implications.

GREEN: Well, our health care records are encrypted by law. In fact, you know, I mean, the legislative environment today is so different than it was in the '90s. In the '90s if you used strong crypto you had to explain yourself. You had to make excuses. You had to defend allegations that you're a criminal and that you're up to no good and you want to hide your wrongdoing with strong crypto.

Today, if you don't use strong crypto you have to explain yourself. You may be legally liable if you're not using strong crypto. If you're a doctor and you don't encrypt your patient records, then why aren't you? You know, you're not HIPAA compliant. If you leave your laptop in a taxi with 200,000 customer identities on it, you better hope that disc is encrypted because if it's not you have an obligation to come clean with the public and tell them that you've lost 200,000 customer identities.

In England, it's even worse. You could even face criminal liabilities in that case. So crypto is required by law. For Sarbanes-Oxley, you know, for best practices of protecting corporate assets, you have to use crypto. And I mean there's all kinds of legal obligations to use strong crypto today. It's now mainstream.

FLATOW: But if you have no, quote/unquote, "good reason" to use strong crypto - if you don't belong to any of those groups and you're just a geeky guy or woman who wants to have strong crypto - do you then become the target? Because what reason do you have to have this unless you're doing something bad?

ZIMMERMAN: Well, everyone does online banking and every time you order something from Amazon you enter your credit card number and you've got to protect that with strong crypto.

GREEN: Every time you use Facebook or Gmail you're using strong crypto whether you notice it or not. That crypto is there and it's also there to protect those companies from any kind of interception.

FLATOW: Well, I had assumed that Skype was using cryptography for its voice calls until I learned that they voluntarily allowed the government to go in there.

GREEN: Yeah. As I mentioned before, sometimes these companies voluntarily cooperate with the government.

FLATOW: Mm-hmm. 1-800-989-8255. Let's get some callers in. Peter in Berkeley. Hi, Peter.

PETER: Thank you, gentlemen, for this fascinating and very important discussion. It's kind of like being the founding fathers of a new age and I really appreciate it. So I'd like to comment about whether we need a less confrontational approach with NSA. I'm a patriot, like most Americans, and I don't mind being looked at, even helped in some way with some intrusion online, but I'd just like to know the source. I'd like the source to be identified.

I'd like us to lean in overt versus covert intelligence and trust that honesty is, you know, the great virtue that can guide us, both the lookers and the participators. But last night, to give an example, I was working on a review under deadline for, by the way, a fabulous musical, "1776," out here in San Francisco with ACP.

FLATOW: It's a great musical, yeah.

PETER: In a great theater. But here I was and I really wanted to get this out. And all of a sudden, in my Gmail, where I was composing, a lot of the text selectively got really tiny compared to the rest. And it looked like somebody was really there online with me, you know, manipulating, making suggestions. Well, look at this phrase again and look at that. But it was kind of discouraging, you know, also to be intruded and not know who was doing it.

And I ended up not getting it out. So how can we protect our Gmail? A lot of people are Gmail users.

ZIMMERMAN: I don't know how to tell you how to protect your Gmail because in the case of Gmail Google scans Gmail for key words and uses it to guide their decisions about what kind of advertising to show you. I don't use Gmail and I don't use Facebook. I, you know, I prefer to use email services that are private. Actually, I have my own mail server in my closet, so - but I don't know if I'm the norm.

I would find a way to do email that's more private.

GREEN: So I do think that the answer is, like we just discussed, that the way these web connections work - Gmail included - they're secured. We use cryptography to prevent people from intercepting those connections, from tampering with your account, from getting malware onto your computer. That's why we use this crypto. And we don't want it to be weakened for that reason.

Yeah.

FLATOW: Well, when you go into Gmail, the EULA, the End User License Agreement, tells you that they're going to be snooping around in your Gmail and you shouldn't be surprised by that.

ZIMMERMAN: Yeah. A lot of these Internet companies monetize customer data and that's their business model. They're providing a free service. And in fact, their customer - you're not their customer. You're an asset that they sell to their real customers, which is advertisers. You know, I have a startup company called Silent Circle that encrypts phone calls and text messages and we don't monetize anything except by collecting money from customers that pay us actual money.

And so, you know, our business model is part of our security model. We don't sell you out to anyone. We just - all we want is your money. Just give us your money and no one gets hurt.

(LAUGHTER)

HELLMAN: There's no free lunch. There's no free crypto.

ZIMMERMAN: That's right. That's right.

FLATOW: Matt, if you're looking to break into an encryption system, how do you go about doing that?

GREEN: So we know that there are kind of three ways to do this. The one that you probably should not try is the hardest, which is to go after the actual mathematics of the encryption. The modern algorithms we use are mostly very strong, and you could spend the rest of your life trying to break RSA or AES, which are two of the common algorithms we use on the Web.

The way that you would probably do the best if you were trying to break into an encryption system is to attack the software that implements it. And I think Bruce Schneier, who is a very famous cryptographer, said it best when he looked at the Snowden documents himself. He said that the mathematics is strong but the code has been subverted.

And by that he meant that the software we were using had in some way been weakened, even though the crypto itself was strong. And so that seems to be the best way to go about weakening crypto.

ZIMMERMAN: Sometimes this means that they will attack your computer or your mobile phone with some kind of attack that subverts the platform - the computer - instead of trying to break the crypto with cryptanalysis. Think of it this way: if you have a steel door on your house, instead of trying to drill through six inches of solid steel, someone could just smash the window with their fist, reach inside, and turn the doorknob to open it.

FLATOW: Mm-hmm. I'm Ira Flatow. This is SCIENCE FRIDAY from NPR. Let's go to the phones, see if we've got a phone call in. Frank in Washington D.C. Hi, Frank.

FRANK: Hi. I'm calling about how the encryption system of PGP is also associated with the trapdoor codes. I think that came up already since I called.

FLATOW: Mm-hmm.

FRANK: That large prime numbers will make this kind of encryption more hard - harder to break.

FLATOW: Mm-hmm. Let me ask Phil.

ZIMMERMAN: Yeah. I want to hasten to add here that, you know, trapdoors are a different thing than back doors. Trapdoor functions are...

FLATOW: Trade craft talk here.

ZIMMERMAN: Yeah.

FLATOW: Tell us about that.

ZIMMERMAN: I think he was just saying something nice about PGP.

(LAUGHTER)

FLATOW: So what's the difference between a trapdoor and a back door?

ZIMMERMAN: A back door is some kind of secret way that you can break in without knowing the key, or to get your hands on the key.

FLATOW: Mm-hmm.

ZIMMERMAN: A trapdoor is - in mathematics there are some mathematical functions that are easy to calculate in one direction but very hard to calculate in the reverse direction.
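The one-way property Zimmermann describes can be illustrated in Python with small primes: multiplying is a single operation, while recovering the factors by naive trial division already takes about a hundred thousand steps even at this toy size, and the gap widens enormously at real key sizes. The specific primes are arbitrary examples.

```python
# Toy illustration of a one-way function: multiplying two primes is
# instant, but recovering them from the product is slow, and RSA-sized
# moduli (hundreds of digits) make the reverse direction infeasible.
# The primes below are arbitrary small examples.

def multiply(p, q):
    # Easy direction: one multiplication.
    return p * q

def factor(n):
    # Hard direction: naive trial division up to sqrt(n).
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    raise ValueError("no nontrivial factors")

n = multiply(104729, 1299709)   # instant
p, q = factor(n)                # ~10^5 divisions even at this tiny size
assert (p, q) == (104729, 1299709)
```

A trapdoor function adds one more ingredient: secret knowledge (the private key) that makes the hard direction easy again for its owner, which is what lets the legitimate recipient, and only the recipient, decrypt.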

FLATOW: Mm-hmm. Let me see if - in the few minutes that I have left, let me ask all of you to where you see the future going with all of this. Let me start with you, Phil.

ZIMMERMAN: Well, you know, I often hear this question of where do I think the future is going. And I never really want to try to predict the future. Instead, I try to make the future. The best way to predict the future is to make the future. There's something a little passive about predicting the future.

What we should be doing is thinking of what we'd like to have in the future and working as hard as we can to get there. And what I'd like to have, is I'd like to change the laws. We can't do it all with technology. We have to push back in policy space. We have to change the laws. We have to realize that what we've done here is we went a little crazy after 9/11 and we've created this powerful surveillance infrastructure that's just incredibly powerful.

That the NSA has created with 35,000 mathematicians and scientists applying their genius to create this thing. And it's so powerful that it is too tempting to the rest of the government.

FLATOW: Mm-hmm.

ZIMMERMAN: It's like that "Lord of the Rings" character - who was...

GREEN: Boromir.

ZIMMERMAN: Boromir, right. Who wanted to use the ring for good instead of evil. But in fact, the ring is so powerful that you can't. You just can't.

FLATOW: Now, let me get - we're running out of time. Let me get another comment. Matt, what do you think?

GREEN: Well, I think the public policy answers are important. We need to go to Congress; we need to get answers on this. I look at the technical answers. I think the very quick answer for me is that we need to start using open source software where it matters. We need to be able to see the code we're using and we need to know if there are back doors in it. So that's a start, anyway.

ZIMMERMAN: Amen.

FLATOW: And Martin?

HELLMAN: Better oversight, which I emphasized before. You know, a less confrontational approach, not just with NSA but a less confrontational approach as a nation. If we keep getting into wars that we don't need to get into, then we're going to be this target, and we're going to need this kind of snooping, and we're going to have collateral damage to our civil liberties.

FLATOW: All right. Gentlemen, I thank you for taking a lot of your time to be talking about this. It's just the beginning. There's lots to talk about. Phil Zimmerman, creator of PGP, also president and cofounder of Silent Circle. That's based in National Harbor, Maryland. Martin Hellman, professor emeritus of electrical engineering at Stanford. And Matthew Green, assistant research professor in the department of computer science at Johns Hopkins University in Baltimore. Thank you all for taking time to be with us today.

HELLMAN: Bye, Ira.

FLATOW: You're welcome.

ZIMMERMAN: So long.

GREEN: Thank you. It's a pleasure to be here.

Copyright © 2013 NPR. All rights reserved. No quotes from the materials contained herein may be used in any media without attribution to NPR. This transcript is provided for personal, noncommercial use only, pursuant to our Terms of Use. Any other use requires NPR's prior permission. Visit our permissions page for further information.

NPR transcripts are created on a rush deadline by a contractor for NPR, and accuracy and availability may vary. This text may not be in its final form and may be updated or revised in the future. Please be aware that the authoritative record of NPR's programming is the audio.
