TERRY GROSS, HOST:
This is FRESH AIR. I'm Terry Gross. The digital world that we've come to rely on - the Internet, social networks, GPS's, street maps - also creates opportunities to collect information about us, track our movements and invade our privacy. Add to that brain scans that might reveal criminal tendencies and new developments in genetic medicine and biotechnology, and you have a lot of potential challenges to basic constitutional principles that our founding fathers couldn't possibly have imagined.
My guest, Jeffrey Rosen, has put together a new book that explores those challenges. Along with Benjamin Wittes, he co-edited "Constitution 3.0: Freedom and Technological Change." It's a publication of the Brookings Institution's Project on Technology and the Constitution, which Rosen directs. He's also a law professor at George Washington University and legal editor for The New Republic.
His new book is a collection of essays in which a diverse group of legal scholars imagine plausible technological developments in or near the year 2025 that would stress current constitutional law, and they propose possible solutions.
Jeffrey Rosen, welcome back to FRESH AIR. So what are the particular parts of the Constitution that you think really come into play here with new technologies?
JEFFREY ROSEN: Well, what's so striking is that none of the existing amendments give clear answers to the most basic questions we're having today. So, for example, think about global positioning system technologies, which the Supreme Court is now considering. Can the police, without a warrant, put a secret GPS device on the bottom of someone's car and track him 24/7 for a month?
Well, the relevant constitutional text is the Fourth Amendment, which says the right of the people to be secure in their persons, houses, papers and effects against unreasonable searches and seizures, shall not be violated. But that doesn't answer the question: Is it an unreasonable search of our persons or effects to be monitored in public spaces?
Some courts have said no. Several lower court judges and the Obama administration argue that we have no expectation of privacy in public, because it's theoretically possible for our neighbors to put a tail on us or for the police to track us for 100 miles, as the court has said. Therefore, we have to assume the risk that we're being monitored, ubiquitously, 24/7 for a month.
But not everyone agrees. In a visionary opinion, Judge Douglas Ginsburg on the U.S. Court of Appeals for the D.C. Circuit said there's a tremendous difference between short-term and long-term surveillance. We may expect that our neighbors are watching when we walk on the street for a few blocks, but no one in practice expects to be tailed or surveilled for a month.
Ginsburg said we do have an expectation of privacy in the whole of our movements, and therefore when the police are going to engage in long-term surveillance, because they can learn so much more about us, they should have a warrant.
There was a remarkable moment in the oral argument for the global positioning system case. Chief Justice John Roberts, who asked the first question, said: Isn't there a difference between a 100-mile search of the kind we've approved in the past and watching someone for a month?
The government's lawyer resisted, and Roberts said: Is it the U.S. government's position that the police could put GPS devices inside the clothes of the members of this court, of these justices, or under our cars and track us for a month? And when the government's lawyer said yes, I think he may have lost the case.
GROSS: So the GPS question is facing the Supreme Court now. When the Supreme Court rules - no matter how they rule, is that going to be a really important precedent in determining the future of how courts interpret our right to privacy and our right to protect against surveillance in this new Internet era?
ROSEN: The GPS case has the potential to be the most important privacy case of the decade. Now it's possible the justices will rule narrowly. Justice Antonin Scalia and Justice Anthony Kennedy both were focused on the fact that there was a physical trespass when the police put a GPS device on the car without a warrant.
They interfered with the owner's property interests in the car, and therefore, said Scalia, it's a search, and a warrant should be required. If the court rules narrowly in that way, it won't answer all these fundamental questions about the future of surveillance.
But what makes the case so crucial, such a galvanizing test for the court, is this: Will the justices be willing to look beyond the existing Fourth Amendment categories, which have been inadequate to confront these new virtual technologies, and take a leap of imagination?
Really, the leap they're being asked to take is the one that Justice Brandeis took in the 1920s when the court decided for the first time the constitutionality of wiretapping.
There, again, there was an answer under existing law, and it was wooden and unsatisfying. A majority of the court in the Olmstead case, in an opinion by Chief Justice William Howard Taft, the former U.S. president, asked: Is it an unconstitutional search when the police put wiretaps under the sidewalk of the office of a suspected bootlegger, eavesdropped on his telephone conversations, then concluded that he was exchanging booze and arrested him?
A majority of the court said no, no trespass, no search. Because the police didn't have to enter into the bootleggers' private property to place the wiretap, he had no expectation of privacy in his conversations.
But in a visionary dissenting opinion, Justice Louis Brandeis, the greatest theorist of privacy of the 20th century, disagreed. Brandeis noted that at the time of the framing of the Constitution, a far less intrusive search - namely, breaking into someone's home and rifling through their desk drawers to read anonymous pamphlets in order to identify the author of a critique of King George III - was the quintessential example of an unreasonable search.
But now, said Brandeis, it's possible to invade the privacy of people at both ends of a telephone wire. And then, in a remarkable passage, Brandeis literally looked forward to the age of cyberspace. He said: Ways may someday be developed by which it's possible, without breaking into desk drawers, to extract papers from the home and introduce them in court before a jury.
Because a lesser invasion was unreasonable at the time of the framing, Brandeis said the court should translate the Constitution and recognize that you don't need a physical trespass to create an unreasonable search.
GROSS: So in your chapter in the new book "Constitution 3.0," you imagine Open Planet 2025, a system that might exist in 2025. Explain that, what the system would be, Open Planet.
ROSEN: Yes, so this was a response to a question at a conference that Google held in 2007. And the head of public policy at Google - then it was Andrew McLaughlin - said he imagined within a few years, Google and Facebook would be asked to put live, online all the public and private surveillance cameras that are now blanketing the globe.
And in fact there are already small apps that do this. I think Facebook has an app that allows you to look at beach cameras in Mexico, which are popular with teenage boys for reasons you can imagine.
But McLaughlin said: What if Google or Facebook were to put up the live feeds, link them and archive them? It would be theoretically possible in such a system to click on a picture of anyone in the world - say me - back-click on me to find out where I'd come from in the morning, forward-click on me to see where I'm going and basically have 24/7 ubiquitous surveillance of everyone on the planet at all times.
This is a sort of GPS case on steroids. It's not just your car being surveilled, it's you, and it's anywhere in the world at all moments. McLaughlin said: First of all, should Google do this? And second, would it violate the Constitution?
And the fact that there was no clear answer to that question when he spoke - obviously the Supreme Court may give us more of an answer in this GPS case - interested me and made me think how inadequate our current constitutional doctrine is to resolve the most profound privacy cases of our age.
So it was a question like that that really inspired the project of trying to think through what the right answer would be, and I can imagine a whole series of different answers that courts might give to Open Planet.
GROSS: In your book, you ask: So what if, in 2025, Facebook does this - it links all public and private surveillance cameras on Americans and puts us all live online? And although this may seem far-fetched, most of the architecture for this already exists.
So what questions would that raise?
ROSEN: The first question, the most striking question, is: It's not even clear that Facebook would be regulated by the Constitution. After all, Facebook is a private actor. The Constitution, the Fourth Amendment only prohibits unreasonable searches and seizures committed by government actors.
So a court or Facebook's lawyers might say Mark Zuckerberg can do what he likes, the Constitution has nothing to say, and if he wants to wire up Open Planet, and people want to use it, there's nothing anyone can do about it except not go outside because Facebook is unregulated.
I think that helps crystallize the fact that at the moment, lawyers at Facebook and Google and Microsoft have more power over the future of privacy and free expression than any king or president or Supreme Court justice. And we can't rely simply on judges enforcing the existing Constitution to protect the values that the Framers took for granted.
Now, a court might hold that if Facebook is wiring up cameras that are owned by the government and by the private sector, or if the FBI or the police are using the Facebook Open Planet system to surveil suspected terrorists, there's enough state action to trigger the Constitution.
But that's not just a technical question. It's the reality of the power that private intermediaries have over civil liberties today that pervades so many of the constitutional challenges that we're thinking about in the book.
GROSS: So if the Constitution can't really, you know, regulate corporations like Facebook or Google, could there be, like, legislation, and is that something that you think would be healthy or not, you know, legislation that would restrict what kind of information companies like Google, Facebook, Twitter could make public without people's consent?
ROSEN: There could be legislation, and there should be legislation. There's a bill pending in Congress right now, a geolocational privacy bill, and it's bipartisan. It's introduced by Senator Ron Wyden, the Oregon Democrat, and Representative Josh Chaffetz, the Utah Republican. [POST-BROADCAST CORRECTION: Representative JASON Chaffetz.]
And Chaffetz and Wyden would say that before the government can have access to geolocational data held on our iPhones or in databases, it needs some sort of warrant or standard of cause. I hope very much that that geolocational bill passes. The fact that it's bipartisan is really encouraging, and it's a dramatic example of the fact that legislatures may be better than judges in regulating private corporations.
A number of states have passed geolocational privacy bills, as well, and I really hope that congressional bill will pass.
GROSS: Well, let's talk about a technology that already exists, that a lot of people are already aware of. And I'm thinking of the kind of Google Maps, the street views where you can see street action: traffic, people walking down the street. And the person who - or the store that is there does not need to give their consent to have this kind of view.
And you might call it just, you know, a real-time view of public space, but to others, it might look like surveillance.
ROSEN: Absolutely. It's a great example of how hard it is to draw the line between a snapshot of someone's movements and ubiquitous dragnet surveillance. So Street View got up and running with the Google vans that would travel down the street and take pictures of public places and post them on the net.
There was an outcry against this in Germany, which has a much greater concern about data collection by the private sector than we do in America - understandably, because of their history with totalitarianism. And because of the objection of German privacy commissioners, Google allowed Germany to opt out of Street View, and basically the trucks don't surveil Germany as extensively as they do in America.
In America, we haven't made that choice. When the snapshots were just still action and allowed you to see the front of a house but not people moving down the street, I think no one would have thought that raised privacy or constitutional concerns in the American sense, because courts have held that we don't have expectations of privacy for being captured in short snippets on the street.
But as the feeds go live, then they raise the very same questions that Open Planet does and force the court to draw hard lines. Maybe there the line isn't between short-term and long-term surveillance, which is the line that the court's being asked to embrace in the GPS case, but between still and live action.
We're also about to see a dramatic clash between the U.S. and Europe on these matters, because Street View could be challenged in Europe, not only by German privacy commissioners who want the trucks to go away, but also under European data laws.
So, for example, right now, if I take my smartphone and take a picture of you on the street, standing in front of a bar or at a political protest, and then I say I don't know who you are, and I plug it into Facebook, and using face recognition technology, identify you, based on other photos of you in the Facebook database and then post that on the net with your name, that's clearly legal in America, no expectation of privacy on the street.
In Europe, that could be challenged under European data laws, which give us a right to our image. There are certain exceptions for news-gathering, for science and for public purposes, but essentially the Europeans are already increasingly recognizing a right in our image that could restrict you from taking pictures on the street. And given the fact that Facebook and Google operate all over the world, the clash between these European and American visions is going to be very dramatic indeed.
GROSS: So one of the questions you raise in your chapter in "Constitution 3.0" is: What is protected speech when you put something on the Internet? And you go back to a 2008 case where a 25-year-old teacher-in-training at a high school - I guess she was a student teacher - posted a photo on her MySpace page that showed her at a party wearing a pirate hat and drinking from a plastic cup with the caption: Drunken Pirate.
Now, what were the consequences of her having put this photo on her MySpace page?
ROSEN: Her supervisors at the school where she taught decided that she was promoting underage drinking, that this was conduct unbecoming to a teacher. And they fired her days before her graduation from teacher's college, and as a result, she didn't graduate.
She sued. She said this was legal conduct, there's nothing wrong with having a plastic cup, and there's no evidence that she was even drinking alcohol and that it was a violation of her First Amendment rights to fire her for posting a legal picture on MySpace.
In a rather cursory opinion, a federal judge rejected the claim. He said that she was a public employee, and her speech was not on a matter of public concern, and therefore it wasn't protected by the First Amendment. And as a result, she never graduated from teacher's college. She chose a different career. She's now working in human resources. But her life was dramatically changed because of this one picture that she posted.
GROSS: So it raises the question: If you put something on your own personal page on the Internet, do other people have the right to use it against you and fire you in the way that she was fired? What are the arguments for protecting her speech? You just gave the argument for allowing her to be fired, dismissed.
ROSEN: The arguments for protecting her speech are that we all need a right to escape our past on the Internet. The idea that you could be permanently tarred by one mistake you make - a picture of you that's taken out of context, that never goes away and that haunts you for the rest of your days - is inconsistent with the American ideal of self-invention, the idea that you should be able to escape your past.
There's that old idea GTT, gone to Texas. You can leave your debts behind you. Well, in the age of the Internet, that is an increasingly elusive idea. Now, how to create that or state that claim as a legal right is not at all obvious, and it's not clear it should be a legal right because of the free speech interests on the other side.
So Europe, once again, is taking the lead in this regard. The French data privacy commissioner has said that there should be a legal right to escape your past on the Internet. He calls it the droit à l'oubli, the right to oblivion, and the basic idea is that if you've put up a picture that you later think the better of, and it's embarrassing to you, you should be able to take it down.
The details of how it's going to be enforced are not clear. He wants to create an international body, some kind of international commission of forgetfulness that would decide on a case-by-case basis what stays up and what goes down. But the free speech consequences of this are obvious.
There was an Argentinean case recently that applied the right to oblivion. An Argentinean pop star had posed for racy pictures, as pop stars sometimes do, and then she thought the better of them and wanted to take them down.
And she petitioned Google and Yahoo! to remove the pictures. They both refused, and an Argentinean judge, invoking this pop star's right to oblivion or her moral dignitary rights, ordered Google and Yahoo! to take down the pictures, fined them thousands of dollars a day.
Yahoo! said it's too hard for us just to take down the racy pictures, we're going to have to remove all references to this woman on the Internet. So now if you plug in her name, nothing comes up.
That to me poses so starkly the clash between the privacy interests on the one hand and the free speech interests on the other. The European data privacy commissioner has endorsed a kind of right to oblivion, and we're going to see huge legal battles as people struggle to delete their past using legal rights.
GROSS: Isn't it - just the fact that Europe has a data privacy commission says something? We don't have an equivalent, do we?
ROSEN: We don't. We're suspicious of government regulators. There have been efforts over the years to propose a meaningful privacy commission in America. Essentially we regulate private-sector data gathering much less vigorously than Europe does because we don't have a tradition in America of protecting a right to dignity.
GROSS: Jeffrey Rosen will be back in the second half of the show. He co-edited the new book "Constitution 3.0: Freedom and Technological Change." It's a publication of the Brookings Institution's Project on Technology and the Constitution, which Rosen directs. He's also legal editor for The New Republic. I'm Terry Gross, and this is FRESH AIR.
(SOUNDBITE OF MUSIC)
GROSS: This is FRESH AIR. I'm Terry Gross. We're talking about technological developments from social networks to GPS's and brain scans that pose potential challenges to our constitutional principles of privacy, freedom of speech and protection against unreasonable search and seizure. My guest Jeffrey Rosen directs the Brookings Institution's Project on Technology and the Constitution, which published the new book "Constitution 3.0: Freedom and Technological Change." Rosen co-edited the book with Benjamin Wittes. Rosen is also a law professor at George Washington University and legal editor for The New Republic.
Now, you write in your chapter in the book that there are people at Google and Facebook and Twitter who really have more control over privacy and free speech than the courts do right now. And you call Nicole Wong the decider. She's the deputy counsel for Google. Why do you call her that?
ROSEN: Her colleagues called her that because she was the one - she resigned recently, but until very recently, she was the lawyer at Google who decided what went up and what stayed down on not only the hundred some-odd Google search engines throughout the world, but also on YouTube, which Google owns. So Nicole Wong was the one who was woken up in the middle of the night by calls from the Turkish government, for example, which was really upset because Greek football fans were posting videos on YouTube saying that Kemal Ataturk, the founder of modern Turkey, was gay - and it's illegal in Turkey to insult Ataturk.
And Wong, who doesn't speak Turkish, had to decide in the middle of the night: Is this a clear violation of Turkish law, in which case it comes down? Or is it an edge case that should be protected as political speech? In the end, she decided to block access to certain videos through Turkish IP addresses. But that wasn't enough for a Turkish judge, who wanted access blocked for the whole country. And access to Google, as a result, was completely blocked in Turkey for a long time. So Nicole Wong, for me, just concretely dramatizes this fact, that lawyers at Facebook and Google have more power over free speech than any king or president or Supreme Court justice.
GROSS: So how are these high-tech companies like Google dealing with things like terrorist videos on the Internet?
ROSEN: So Google has been under a lot of pressure - in particular from Senator Joseph Lieberman - to remove terrorist videos on the Internet. And Lieberman gave Google a list of videos that he wanted removed, and Google agreed to take down those videos that violate their terms of service, which prohibit speech that incites violence. They recently added a category, promotes terrorism, but they refused to remove all the videos because they said that they were protected speech. And as a result, Lieberman was upset, but he got some of what he wanted.
More recently, Twitter was pressured by Lieberman to remove pro-Taliban feeds. And Twitter's even more pro-free speech than Google. It doesn't have a promotes-terrorism exception. It will only remove content that is illegal or that promotes violence. And since they concluded that these were essentially pro-Taliban news feeds that didn't promote imminent violence, they refused to remove the tweets. Again, this shows that the particular user standards and codes of conduct adopted by Google and Twitter are more important than the First Amendment in determining who can speak and who can be heard.
GROSS: So even though you think that corporations basically now are making a lot more decisions about privacy and the Internet than the Supreme Court is, the Supreme Court faces the GPS decision now, and there will be other decisions coming to the court.
So on the Supreme Court now, you have several justices who define themselves as originalists, meaning that they want to interpret the Constitution in a pretty literal way as they believe the Founding Fathers intended it at that time. And so how have originalists so far been dealing with new technologies that the Founding Fathers couldn't possibly have dreamed of?
ROSEN: You know, they haven't done so badly, actually. Surprisingly to many people, one of the best decisions on new technologies was written by Justice Antonin Scalia, the originalist-in-chief. It was a case involving thermal imaging technology from 2001, where the cops basically used this technology to measure the heat on the outside of a house. Based on the measurements, they decided that one area of the house was really hot and concluded that the occupant was using heat lamps to grow marijuana. Then they got a warrant, broke in, found that he was growing pot and arrested him.
He objected. He said that the use of the thermal imaging technology initially without the warrant was an unreasonable search, and in a five-to-four decision written by Justice Scalia, the court agreed. And Scalia said that people had to have at least as much privacy inside their homes as the Framers took for granted. And since this technology could reveal intimate details of the home, such as the hour in the day that the lady of the house was taking her daily sauna bath, was his example, therefore, a warrant should have been required.
GROSS: I want to ask you about a chapter written by Orin Kerr, and tell us who he is.
ROSEN: Orin Kerr is a colleague of mine at George Washington Law School. He's a superb thinker about privacy and surveillance, and is one of the leading commentators about the Fourth Amendment.
GROSS: So he writes about the increasing use of surveillance and data mining technologies, and he focuses on monitoring systems in subways, systems that could monitor subway riders as they enter and exit the station by collecting their fingerprints. Now, this isn't being done now, but it could be done.
And he points out that this could foil terrorist plots. It could help solve crimes. At the same time, it would be collecting a lot of data on individuals who weren't terrorists and who weren't involved in criminal activity. And he suggests that rather than restricting the collection of this data, legislators should be paying more attention to the use of the data after it's collected. Can you talk about that point of view?
ROSEN: Yes. It is a very important suggestion, which would go a long way towards solving many of the privacy issues we're concerned about. So Orin Kerr recognizes that if these monitoring systems are really just used for terrorism and they're not used to go after people for low-level offenses - for example, if you're in a divorce trial and you want to check the subway fingerprint database to find out whether your spouse was committing adultery - then many people would support the use of the monitoring systems.
Kerr's model, use limitations, has been embraced by Germany - which, once again, is a leader in avoiding the dangers of totalitarianism. The German intelligence services have broad discretion to seize a lot of data, to engage in warrantless monitoring, but they can only use what they find for terrorism cases and they can only share it with the police in terrorism cases or violent crimes. If they find evidence of adultery, the intelligence people cannot share it with the police. So that's an example of the use limitation that Orin Kerr is endorsing, and I very much agree with him that if Congress would adopt those use limitations, we'd be in a much better place.
GROSS: But once data is collected and stored, who knows how it's going to be used? Who knows who's going to secretly and illegally get access to that information and use it against somebody?
ROSEN: Absolutely. And as you say, there can be break-ins. The use rules can be violated. Now, audit trails are helpful here, and we have the ability to record every unauthorized access to a database. So you can think of pro-privacy security measures that would really protect privacy and security at the same time, but use limitations are an imperfect solution.
GROSS: What does the Patriot Act have to say about surveillance technologies?
ROSEN: The Patriot Act dramatically expanded the amount of surveillance the government could do without a warrant. Section 215 of the Patriot Act says that the government can seize any data, any tangible thing, as long as it's relevant to a terrorism investigation. That's library receipts or diaries or computer records or bank records, or anything at all. Before the Patriot Act, you generally had to get a warrant to seize that material, and if the government was going to get it without a warrant, it had to certify that the target was a suspected spy or terrorist before it could get the material.
But by eliminating that requirement and allowing the government to seize anything just by saying it was relevant to a terrorism investigation, the Patriot Act essentially gave a green light for surveillance. And plenty of independent studies have shown that this authority has been used repeatedly not to go after suspected terrorists, but to go after immigration offenses and low-level crimes that have nothing to do with terrorism. Essentially, the surveillance authority is not being used for terrorism.
GROSS: My guest is Jeffrey Rosen. He co-edited the new book "Constitution 3.0: Freedom and Technological Change." We'll talk more after a break. This is FRESH AIR.
(SOUNDBITE OF MUSIC)
GROSS: If you're just joining us, my guest is Jeffrey Rosen. He's a law professor at George Washington University. He directs the Brookings Institution's Project on Technology and the Constitution. And he's the co-editor of the new book "Constitution 3.0: Freedom and Technological Change."
Let's look a little bit at how the new MRIs, the fMRIs that can do brain scanning, could be used in courts of law and the questions that raises. Cognitive scientists believe that these brain scans can tell us which part of the brain is being lit up during certain thoughts. What else can they tell us that might be used in legal ways?
ROSEN: Well, fMRIs have the potential to change our notion of criminal responsibility. There's a new defense that's increasingly coming up that's called My Brain Made Me Do It. Essentially, after someone's been convicted, at the sentencing phase they'll try to introduce fMRI evidence that the amygdala - which is the area of the brain responsible for emotion and impulse - is overactive, and the prefrontal cortex - which is supposed to be the restraint or the conscience on the amygdala - is not doing its job. And based on that evidence, they're trying to claim that they didn't have the ability to control themselves, and therefore, if their guilt can't be excused, their punishment should at least be mitigated.
Now, so far, jurors haven't been buying this much, but it's becoming de rigueur in death penalty cases. And people speculate that as brain scans become more granular, you might be able to do preemptive screening on people and find out that they have a predisposition to violence because of their overactive amygdalas and detain them indefinitely as, you know, potentially antisocial people. So the question of whether neurolaw will, in fact, transform the legal system is a really interesting one.
GROSS: Yeah. And if you decided to imprison somebody because they had a predisposition to commit crime, you'd be incarcerating them for something that hasn't happened.
ROSEN: Absolutely. And it seems like such a basic violation of the liberal principle that we should be held responsible for what we do, not for what we think. But it's interesting. There's not a clear constitutional provision that would prevent the government from imprisoning us for our predispositions. The nearest analogue is maybe the constitutional prohibitions on bills of attainder.
Essentially, Congress can't pass a law making me an outlaw, a stranger to the law, and saying my blood is attainted and all of my descendants don't have any legal rights. That was what they were concerned about at the time of the framing. And you could say that it's a sort of attainder to be imprisoned for your predispositions. But the analogy clearly isn't perfect. The court might hold, on the other side, that we have no expectations of privacy in our brain waves - I put out my brain waves the same way I put out the trash, as the court has held in one case - and therefore any notion of cognitive liberty isn't protected by the Constitution.
GROSS: So do you think this is going to become more and more of an issue as the technology gets more and more refined?
ROSEN: There's no question. Jurors are especially impressed with these pretty pictures. When they see pictures of the brain lighting up, they tend to give that much more weight than dry testimony about someone's background. But I'm not so convinced that there will be a radical challenge to the very notion of criminal responsibility. I mean, there are some who say that because our brains make us do everything, the very notion of retribution is already obsolete - you can't punish people for their freely chosen actions, because none of our actions are freely chosen. And therefore, maybe you could lock people up for deterrence reasons, so they don't commit future crimes, but it doesn't make any sense to hold people blameworthy in any sense, because we're not responsible for our actions.
I don't buy that because, as Stephen Morse, one of the contributors to our volume, notes, the law generally doesn't care why we act the way we do. It could be because we had bad mommies and daddies or because we ate too many Twinkies. But as long as we're not acting under duress - in other words, with a gun to the head, you know, do it or else - and we're not criminally insane and therefore unable to understand the consequences of our actions, the law usually does treat us as if we're autonomous beings.
GROSS: So let me, for a moment, just elevate you to the status of the one and only Supreme Court justice who has to make all the decisions...
(SOUNDBITE OF LAUGHTER)
ROSEN: Oh, I'd be very - very bad at that.
GROSS: ...about the First and Fourth Amendment interpretations.
(SOUNDBITE OF LAUGHTER)
ROSEN: I don't want that job. Let Nicole Wong do it. That's not for me.
(SOUNDBITE OF LAUGHTER)
GROSS: Okay. But you have it. You have it for the next couple of minutes. What are some of the interpretations you would give to the First and Fourth Amendments in the face of some of the new technology we've been talking about?
ROSEN: I think we need to recognize that it is an unreasonable search of our persons and effects to be the objects of long-term dragnet surveillance. We do have an expectation of privacy in public, and neither the government nor private individuals can collect so much information about us that we're under virtual dragnets for extended periods of time. That's the first thing that I would recognize.
And I really hope the court will do just that in the GPS case. That's why it's such an important case. More broadly - and I'm not sure I would embrace this myself as a Supreme Court justice right now, because I think the court should be cautious about embracing very abstract constitutional principles in the face of new technologies - you could imagine a justice like Justice Anthony Kennedy recognizing a broad right of personal autonomy linked not only to the Fourth Amendment but also to the First Amendment.
Remember, it's in Roe v. Wade and the case upholding it, Planned Parenthood v. Casey, that Kennedy said, at the heart of liberty is the right to define one's own conception of the meaning of the universe and the mystery of human life. Justice Scalia unkindly ridiculed that as the "sweet mystery of life" passage, but there is an idea that when the government objectifies and surveils us, it's violating our autonomy and making it impossible for us to make basic decisions about intimacy and about our public identities.
GROSS: So now that you've been focusing on issues of privacy and surveillance in this era of new technology, what do you see around you that you never noticed before? Obviously surveillance cameras. What else?
ROSEN: Well, my mind is often leaping ahead to these science fiction constructs, like we have a wonderful chapter about whether you can copyright human life. And James Boyle from Duke imagines a kind of sapient computer called Hal, or a genetically engineered sex doll that has no brainstem but looks like a human being, and wonders whether the court would allow these to be copyrighted, given the prohibition on patenting human life and the argument that to create a property interest in a human being violates the constitutional prohibition against slavery.
So I'm definitely more sensitive to these sorts of futuristic scenarios. But more broadly, I'm more aware of the way that devices are interacting with each other, so when I go down the street, I think increasingly about this developing notion of an Internet of things, which will allow devices to talk to each other.
So smart cars are now collecting a tremendous amount of data about how fast we're going and how much gas we're using, and monitoring body functions and so forth. And if this were given to insurance companies, they could charge us different rates on the basis of it. Smart cars can talk to mobile devices, and there are experiments being done that would allow the car to sense whether a human being is walking down the street, and at what distance, and to stop automatically.
But essentially, you become more sensitive to the way that smart devices are interacting with each other, and it can definitely change what you're thinking about when you're walking to school.
GROSS: Well, Jeffrey Rosen, I want to thank you so much for talking with us.
ROSEN: Thank you, Terry.
GROSS: Jeffrey Rosen co-edited the new book "Constitution 3.0: Freedom and Technological Change." He directs the Brookings Institution's Project on Technology and the Constitution and is a law professor at George Washington University and legal editor for The New Republic. You'll find links to some of his New York Times magazine articles on our website, freshair.npr.org.
NPR transcripts are created on a rush deadline by Verb8tm, Inc., an NPR contractor, and produced using a proprietary transcription process developed with NPR. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.