Debating the Constitution: Technology and Privacy - Open to Debate
June 6, 2017

Do you have a secret that no one else knows?  What about Apple, Google, Facebook, Verizon, or Uber?  Are you sure they don’t know your secret?  Digital data – emails, text messages, phone records, location records, web searches – contain traces of almost every secret.  They also contain traces of almost every crime.  Tech companies may promise to protect our data from prying eyes.  But should that promise yield to law enforcement and national security?

The Motion: Tech Companies Should Be Required To Help Law Enforcement Execute Search Warrants To Access Customer Data

09:00 PM Tuesday, June 6, 2017
  • 00:00:00

    [Applause]

    John Donvan:

    In a time when companies like Amazon and Google and Facebook are piling up mountains of data about us, the one place left in our digital lives where true privacy can be found exists, oddly enough, on our smartphones, which are designed so that when you put that phone on lock, no one can get past its encryption — not even, say, Apple with its iPhone or Google with its Pixel, which is great, right?

    But not if you’re in law enforcement and you’ve got reason to believe that a bad guy’s phone contains secrets that can solve crimes and stop terrorist attacks. Well, in that case, should Apple or should Google help the Feds bust the encryption? Isn’t doing anything you can to help in such cases every citizen’s duty? Isn’t it patriotic? Or is the sort of privacy that encryption represents something sacrosanct, not to mention something fragile?

  • 00:01:07

    You put a backdoor into it, who knows who might come through it later?

    Well, this all sounds like the makings of a debate, so let’s have it. Yes or no to this statement: Tech companies should be required to help law enforcement execute search warrants to access customer data. That’s our debate.

    We are in San Francisco at the SF Jazz Center in partnership with the National Constitution Center with four superbly qualified debaters who will argue for and against the motion. Our debate goes in three rounds, and then the audience here in San Francisco votes to choose the winner and only one side wins.

    What we’d like to have you do now is vote your opinion as you come in off the street to tell us where you stand on this motion. Again, take a look at the language. It’s a lot of words. Tech companies should be required to help law enforcement execute search warrants to access customer data. Go to the keypad at your seat.

  • 00:02:00

    Press number one if you agree with the motion, the motion — the side that will be argued by this team. Push number two if you disagree with the motion, this team’s position. Push number three if you are undecided, which is a perfectly reasonable position to be in as the debate starts.

    Tonight we are debating the responsibility of tech companies when the government comes asking for data, especially encrypted data. We have one team that will be arguing in support of that idea. Let’s first meet the first debater arguing for the motion. Please welcome Stewart Baker.

    [Applause]

    So, Stewart, you’ve served in government in important positions. You were general counsel for the NSA. You served under President George W. Bush at the Department of Homeland Security as its first assistant secretary for policy. You have long argued that folks who oppose government access to the kind of data we’ll be talking about tonight underappreciate how access to that data can enhance our security.

  • 00:03:04

    So where does that appreciation come from? What do you know that they don’t?

    Stewart Baker:

    So it’s not what I know, it’s who I know. I’ve seen those people who are at the FBI, at NSA, at DHS who are trying to protect us — more than half of them joined after 9/11 because of 9/11. But they are absolutely under-resourced, overwhelmed. They need our help. Without our help, they will not succeed. And that’s why I believe that everyone owes them a duty of providing assistance when they can.

    John Donvan:

    And it’s why you’re on this side.

    Stewart Baker:

    Yes.

    John Donvan:

    Can you tell us please who your partner will be tonight?

    Stewart Baker:

    He is my debating partner now for the second time in Intelligence Squared, John Yoo. He is a pleasure. I’d share a foxhole with him any day.

    John Donvan:

    Ladies and gentlemen, John Yoo.

    [Applause]

    John, you’re a professor of law at Berkeley, a visiting scholar at the American Enterprise Institute.

  • 00:04:03

    Following the September 11th attacks, you worked on national security and terrorism issues at the Department of Justice, wrote some controversial memos — which will be in your obituary.

    [Laughter]

    Third time you’ve debated with us. But the last time we actually did it in Philadelphia, which is your hometown, and your mother was in the audience. And you told us then that there was no way you could lose with her sitting there. Well, she’s not here tonight, so what does that do — what does that do to your game?

    John Yoo:

    Well, since it’s the fourth time — you keep inviting me, I keep losing. So, you’re the ones with the problem.

    [Laughter]

    John Yoo:

    And the second — just comes to mind is — you know, Mom probably works at NSA now, so she’s still in the audience listening anyway.

    John Donvan:

    [laughs] All right. Ladies and gentlemen, the team arguing for this motion.

    [Applause]

    And now let’s meet the team arguing against the motion. First, welcome Michael Chertoff.

    [Applause]

  • 00:05:00

    Michael, you’re — you’ve debated with us a number of times before as well. Welcome back. You are the co-founder of the Chertoff Group. You were the second Secretary of Homeland Security under George W. Bush. Before that, you headed the Department of Justice’s criminal division. You were a federal judge. But way, way back, you were a young prosecutor and you helped to put quite a few mob figures behind bars back in those days. Of course, back in those days, there was really no digital data like we have it today. But honestly, would it have made your job easier, if there had been?

    Michael Chertoff:

    Yeah. We — first of all, let me just say I’m delighted to be here, and both John and Stewart were colleagues when I was in government. You know, we did it the old-fashioned way. Guys used to be wiretapped or they’d have electronic surveillance. They’d leave the room. They’d walk around the block. They’d turn the radio up loud. So, we made our cases with witnesses, photographs, circumstantial evidence, and we were successful. In fact, I put a bunch of guys away for 100 years apiece.

  • 00:06:03

    John Donvan:

    And did you have to ride a horse back and forth to work?

    Michael Chertoff:

    It wasn’t that long ago. [laughs]

    John Donvan:

    And please tell us who your partner is.

    Michael Chertoff:

    Catherine Crump is my partner. She is a professor at Berkeley. I’ve not had the privilege of debating with her before, but I’m looking forward to it this time.

    John Donvan:

    Ladies and gentlemen, Catherine Crump.

    [Applause]

    And Catherine, as Michael said, you’re a professor of law also at Berkeley, and acting director of its Samuelson Law, Technology & Public Policy Clinic. For nine years you were a staff attorney for the ACLU. You have been sounding alarms for years about the staggering amount of data that law enforcement can and does collect on people’s actual movements by tracing their cell phones, by photographing license plates. Day to day, what steps do you take to make yourself less digitally visible? Or is it not even possible anymore?

    Catherine Crump:

    Well, you know, today it’s pretty tough.

  • 00:07:00

    Online you have tools like Signal that can help you retain some measure of privacy in your data. But sometimes, in physical space, it’s hard to do much but smile for the cameras.

    John Donvan:

    [laughs] All right. Thanks, Catherine Crump. And the team arguing against the motion.

    [Applause]

    Okay. Now we move on to Round 1. Round 1 will be opening statements by each debater in turn. They will be six minutes each. And speaking first for the motion, Tech Companies Should Be Required To Help Law Enforcement Execute Search Warrants To Access Customer Data — here is Stewart Baker, former general counsel for the National Security Agency.

    Ladies and gentlemen, Stewart Baker.

    [Applause]

    Stewart Baker:

    Thanks, John. John and I have decided to divide the argument. I’ll be talking about the obligations that all citizens have to help law enforcement when necessary, which I believe leads on to the obligation of tech companies to provide assistance.

  • 00:08:09

    John will be talking about why, particularly today, law enforcement needs help from technology companies. So let me start. And I’d like to start, as I always do, with the question, which is whether tech companies should be required to help law enforcement execute search warrants to access customer data. And I want to stress what that question doesn’t require you to support in order to come out in the affirmative. We’d love it if you concluded that the government can require companies to put back doors in their products or break their crypto.

    If you believe that, then you’re obviously going to support this motion. But that’s not what the proposition says. It says they should be “required to help law enforcement.”

  • 00:09:03

    And to my mind, that does not mean they’re always required, but they are sometimes required to help law enforcement. And that’s not really a surprise because everybody is required to help law enforcement in the right circumstances. If you have a unique ability to help law enforcement, and law enforcement can’t solve the problem on its own, you have an obligation to assist law enforcement. This has been true for hundreds of years, well before the United States was founded. There was a common law obligation to assist law enforcement upon request, particularly when only you could provide that assistance. We all understand this. If you are witness to a crime, if you have evidence of a crime in a file cabinet behind your desk, you’re going to get a subpoena from the government, and you have an obligation to assist the government by providing them with the evidence that you already have.

  • 00:10:10

    This is the rule for all of us. You’re going to get subpoenas, you’re going to get search warrants, you’re going to get requests for that data. If you’re a landlord and your tenant is suspected of engaging in drug selling or some other crime, the government’s going to come or the police will come with a search warrant and they will say, “Can you help us? We don’t want to knock down the door. We’d like you to use your master key to get us in, particularly because that will allow us to get in without the subject knowing that he’s being investigated, and that may turn out to be important.” And you have an obligation as a landlord to provide that assistance. This is a requirement for all of us. And it’s not different for tech companies. There is no Silicon Valley exceptionalism policy that applies.

  • 00:11:05

    In fact, the Supreme Court has said exactly that, in a case in which the United States was asking for help from New York Telephone, now Verizon, saying, “We would like you to assist us in carrying out an intercept of communications data online.” And the company that’s now Verizon said, “No. We don’t feel like it. Why don’t you do it? You can run a line to the wiretap.” And the government said, “You are in a unique position to assist us in a way that will not be obvious to the criminal, and therefore you have an obligation to provide that assistance.” The Supreme Court said, “Yeah, that’s right. There is no special exception for phone companies or tech companies. You need to provide that because it’s part of your obligation as a citizen.”

  • 00:12:02

    I guess I shouldn’t sit down without mentioning the elephant in the room, which is, of course, Apple against FBI. And I want to make clear that while I’m pretty skeptical about Apple’s arguments in that case, you don’t have to share my skepticism entirely to vote in the affirmative in this case. No one is arguing here that the obligation to help law enforcement is without boundary. If you can show that it’s too burdensome, that the government can do this without your help, that it’s going to cost too much, or that it’s going to hurt your customers, and if you can make a persuasive argument to that effect, then under current law you don’t have to provide the assistance. But if you can provide it, you are required to provide that assistance.

    The one place where I think Apple made an argument that is inconsistent with voting for this proposition is when they said, “Oh, we can help. We just don’t want to.”

  • 00:13:08

    That is exactly a defiance of the obligation that every other citizen has to provide assistance to the government. And there’s no exception that says, just because you’re the world’s wealthiest company you don’t have to do this. And so if you agree with that proposition that there isn’t a tech — a Silicon Valley exception from the obligations of citizenship, then you ought to vote in support of this motion. Thank you.

    John Donvan:

    Thank you, Stewart Baker.

    [Applause]

    And that motion is, Tech companies should be required to help law enforcement execute search warrants to access customer data. Our next debater who will argue against that motion is Catherine Crump, professor of law at UC Berkeley and former staff attorney for the ACLU. Ladies and gentlemen, Catherine Crump.

    [Applause]

  • 00:14:00

    Catherine Crump:

    You don’t need to believe that there’s a Silicon Valley exception to the obligation to help in order to oppose this resolution. This debate is not about whether tech companies should hand over evidence that they’re capable of accessing in response to a properly obtained warrant. Of course they should. This debate is about whether the government, by controlling the use of strong encryption or through other mandates, can obligate companies like Apple to design devices like iPhones less securely in order to facilitate the government’s access to data. The answer to that question should be no. In this era of profound cyber insecurity, the government’s role should be to encourage companies to design devices more securely, not less securely.

    I am going to talk about the importance of encryption in preserving free speech and commerce online. And then my partner, Michael Chertoff, will talk about why the widespread availability of encryption enhances our national security rather than detracts from it.

  • 00:15:10

    We rely on the internet for virtually everything. We use it to communicate with friends and loved ones, to understand medical diagnoses, and to engage in banking. Corporations store their most valuable proprietary information online, and the government also stores vast troves of data digitally, including law enforcement and national security information. As a result, the security of the internet is critical. Yet the systems we rely on to store all of this data are radically insecure. Earlier this year, Pew reported that over half of Americans have personally experienced a major data breach.

    The issue is urgent. Having the content of your email account dumped online can be devastating.

  • 00:16:00

    Just ask Hillary Clinton’s campaign manager, who not only found that personally embarrassing, but it well could have affected the course of the presidential election. Our data is leaking all the time in large volumes. Companies have repeatedly failed to protect it. People increasingly realize that their data is vulnerable. If we want the internet to continue to be a place where speech and commerce flourish, we need to have people be able to share their thoughts and their credit card numbers over the internet. Strong encryption is the best defense available against the growing threat of cyber-attacks. By strong encryption, I mean encryption that is, for all intents and purposes, unbreakable. When strong encryption is deployed, users hold the keys to their own data. That means that data is safe from prying eyes, including the eyes of tech companies. Some suggest that we ought to build a back door in order to allow law enforcement access to data.

  • 00:17:03

    The problem with that is you cannot build a back door that works only for the U.S. government, good guys, or other people with good motives. If you build it for them, the encryption will be weakened for everyone. No one can guarantee the security of a back door, not tech companies, not the government, no one. That’s the lesson of the recent WannaCry ransomware attack. Over 200,000 computers in 150 countries were taken hostage. Why and how? Using a vulnerability that the NSA lost control of. The capacity for securing these types of secrets, to the extent it used to exist, is no longer present today. The closer you look at the feasibility of creating a back door, the more impractical such a solution becomes. Just think about this: What phones would it apply to?

  • 00:18:02

    Would it apply to older phones? Would they be grandfathered in? What about phones that are built overseas? When a German traveler comes to the United States and their phone is noncompliant, will they have to surrender that phone at the border? If so, that’s a massive inconvenience for everyone who wants to come here. If not, that’s a huge loophole in any such requirement. Who should be able to recover data? If the answer is that tech companies should be able to recover data, well, they won’t just be pressured by the United States government to make data available, but by every government around the world, no matter how despotic or corrupt. And if the answer isn’t that tech companies should control this but rather U.S. law enforcement, well, good luck ever selling that phone overseas. To me, the issue is not about protecting us from the government. We have the rule of law, the Fourth Amendment, due process, and a culture of compliance to help us with that.

  • 00:19:04

    The issue is protecting us from the bad guys. There are a lot more bad guys out there than law enforcement agents. If we create an opportunity for government agents to use a backdoor, then that’s going to be taken advantage of many times over by criminals. They are unconstrained by laws and norms, and they don’t get warrants. If they know there is a key or some other way to access data, they will do everything they can to obtain it, and that key will become a central failure point. That is exactly what the distributed structure of the internet was designed to prevent. In this age of radical insecurity, we will all be better off if companies increase security for user data rather than weaken protections.
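
    To make the mechanics behind Crump’s argument concrete, here is a minimal sketch of user-held-key encryption, assuming Python and the third-party cryptography package; the key handling and messages are illustrative, not any company’s actual design:

```python
# Minimal sketch of user-held-key ("strong") encryption, using the
# third-party "cryptography" package (pip install cryptography).
# Illustrative only; not any vendor's real design.
from cryptography.fernet import Fernet, InvalidToken

user_key = Fernet.generate_key()   # generated and held by the user alone
cipher = Fernet(user_key)

ciphertext = cipher.encrypt(b"a secret no company can read")

# The user, holding the key, can decrypt.
assert cipher.decrypt(ciphertext) == b"a secret no company can read"

# Anyone without that exact key -- a tech company, a government, an
# attacker -- gets nothing but an error.
try:
    Fernet(Fernet.generate_key()).decrypt(ciphertext)
except InvalidToken:
    print("wrong key: the ciphertext stays unreadable")
```

    This is the sense in which Crump says users “hold the keys”: the service storing the ciphertext has nothing it could hand over in intelligible form, with or without a warrant.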

    John Donvan:

    Thank you, Catherine Crump.

    [Applause]

    And a reminder of what’s going on. We are halfway through the opening round of this Intelligence Squared US debate. I’m John Donvan.

  • 00:20:01

    We have four debaters, two teams of two fighting it out over this motion: Tech companies should be required to help law enforcement execute search warrants to access customer data. You’ve heard the first two opening statements, and now on to the third. John, you can make your way to the lectern. Again, John Yoo, professor of law at UC Berkeley, arguing for the motion. Ladies and gentlemen, John Yoo.

    [Applause]

    John Yoo:

    Thank you. It’s wonderful to be here. This is a great venue. It is the cleanest jazz club I’ve ever been in.

    [Laughter]

    I don’t understand what kind of jazz is being played here, but it’s probably safe for all ages. It’s a great pleasure to be here. It’s also wonderful to be back with IQ Squared. All the jokes don’t come out of my time, by the way. It’s great to be here with IQ2 again. As I said —

    John Donvan:

    Actually, they do, so –

    John Yoo:

    Well, then I’m going to make them longer. So it’s great to be here because this is the fourth time that I’ve had the pleasure — actually, third time I’ve had the pleasure of losing, so I’m willing to do anything to win this time.

  • 00:21:05

    I am not going to go down as a four-time loser on these things. And Stewart, my partner, we debated last together about a year and a half ago in Philadelphia, and we lost. So I told him to pander to the high-tech audience as much as possible. So what did he do? He didn’t wear a tie. That’s pandering for people from Washington, D.C.

    [Applause]

    So it’s great to be with Stewart.

    It’s also my great pleasure to be debating against my friend Mike Chertoff, who I often think of as the finest lawyer I’ve ever encountered, certainly in government service. I’ve never been an opponent of his, though, and so now after many years of friendship and working together, I’m finally going to be able to say what the hell I think about him. So watch out, Mike. And it’s great to be here with my dear colleague Catherine. I should have required when I voted for her hiring at Berkeley that she agree never to debate me in public.

    [Laughter]

    I forgot to leave that in her contract. However, I’m glad she’s here — and now I’m going to get serious — because I think I heard her concede on the question presented.

  • 00:22:05

    Right? The question presented is a very simple one: Should tech companies help law enforcement? And I think Catherine said, yes, of course they should. So nobody has to listen to anything we say after that point. She cleverly — and this is why she’s so smart and we hired her at Berkeley — she cleverly changed the question into a debate about encryption. I don’t know anything about encryption. I don’t care about encryption. I care about the Constitution, which is why I’m ready to debate impeachment at the next debate, by the way. But I care about the Constitution. And the Constitution doesn’t say anything about encryption. What the Constitution says is — and let me pull out my prop — I’m going to win this time — so I’ve got the Fourth Amendment, right? You should all get one of these. They’re pocket Constitutions. Actually, it’s a little-known secret, if you write to the Supreme Court, they will send you a free one. I don’t know. They all have different versions, though. I’m not sure which one you’re going to get. So.

  • 00:23:00

    [Laughter]

    The Fourth Amendment says: The right of the people to be secure in their persons, houses, papers and effects against unreasonable searches and seizures shall not be violated. Notice, it doesn’t say “against all searches and seizures” or against “some searches and unreasonable searches and seizures.” I hope everyone here tonight will at least agree that that’s the standard. What’s reasonable? And look, I’m not going to tell you what is reasonable. I’m a citizen, like you. We all have our views on reasonableness. Ultimately, I think it’s for our elected representatives who we send to Washington to vote on legislation to decide what’s reasonable or not. That’s how we handle other changes in technology, from the telephone, to money transfers — to all kinds of things where, at the beginning, everyone said, “Oh, it’s so different. We should just have no rules or totally new rules.” And what we did is we adapted the rules we had in the past to the new situation we have before us, in a reasonable way.

  • 00:24:03

    And that’s the way our society operates and that’s what we should do, I think, with data held by tech companies. If you think it’s reasonable to use reasonableness, this is how I would do the test. According to the Supreme Court in many cases — one’s called Tennessee v. Garner — the court has said that when you judge reasonableness, you balance the benefits of pursuing a particular action — in terms of whether it advances a government interest — versus the loss of privacy. It’s not a categorical “Everything’s off limits” or categorically “The government can do whatever it wants.” It calls on us to make a balancing choice. And we have chosen repeatedly to ask the judges to do that for us. So, what would be in the balance? In this case, the balance would be the reduction of the possibility of terrorist attacks. I think Stewart mentioned that it would be remiss not to mention Apple versus FBI.

  • 00:25:01

    I also think it would be remiss not to mention that the United Kingdom — the nation most similar to us probably in the world — has just suffered two terrible terrorist attacks in the space of a week and has suffered even more in the last few months, and we’ve seen a spate of terrorist attacks not just against the United Kingdom, but terrible attacks in Nice, Paris, Brussels. And let’s not forget the United States. And just — I think sometimes we have a short attention span. We’re being led by a president who has an even shorter attention span — and I think we forget that things have just been happening the last few years in our country.

    Just in 2013, four years ago, terrorists bombed the Boston Marathon, killed three people, injured 260 people. Just two years ago, in San Bernardino, 14 people were killed by terrorists, 22 injured. Just last year, Orlando. The gay nightclub was attacked, again, by a terrorist. 49 people killed.

  • 00:26:01

    53 people injured. The reason I mention these is not to raise a scare that there’s terrorists all around, but just — the government’s interest is to try to reduce those attacks. And the only way to do it in this kind of world we’re living in with terrorists that organize themselves as networks where they share information and take advantage of global commerce is to get data and information on them if we’re to have any chance to try to stop the attacks from succeeding in the future. Thanks very much.

    [Applause]

    John Donvan:

    Thank you, John Yoo. And now making his statement against the motion and making his way to the lectern, Michael Chertoff, former Secretary of Homeland Security. Ladies and gentlemen, Michael Chertoff.

    [Applause]

    Michael Chertoff:

    Well, John, I’m not going to match you joke for joke. I’m going to need all my time. I want to begin on a serious note. We obviously all deeply feel for the families that lost loved ones in London and in Manchester, and all over the world.

  • 00:27:03

    And we know it’s very important to do the best we can to stop these kinds of things from happening. And I will tell you — and I say this with the experience of having been on duty on September 11th — the kinds of capabilities that tech companies provide to the U.S. government that can be used to detect and prevent terrorism are vastly greater than they have ever been. There is a treasure trove of information that has developed not only through metadata — who is contacting whom, who is calling or sending messages to whom — but locational data, video data, all of this enhanced with artificial intelligence and analytics — all of this is made available to the government when it’s in the possession of a company, provided the government has a warrant, or a subpoena, or some appropriate legal process. When I look at the resolution, the resolution doesn’t say that we resolve that the tech companies should comply with the law and lawful process, because nobody debates that.

  • 00:28:03

    The question is, should the tech companies be required to go beyond subpoenas and warrants, and either turn things over on a voluntary basis — where there is not a basis to get a warrant or there is no subpoena — or, even more significantly, should tech companies be required to take steps to weaken encryption or other measures that protect information simply because right now those tech companies don’t have access to the information and therefore they can’t comply with the requirement that it be turned over. And let me be clear exactly what I’m talking about. There are now many different kinds of applications you can use that don’t give the service provider the ability to access the data. So when they get a subpoena or they get a search warrant, they give over what they can, and that may be the identity of the phone or the IP address. They turn over what may be available to them in terms of routing the messages from one point to another.

  • 00:29:05

    That’s the metadata. But they do not have the ability to turn over the message in an unencrypted form, or, in the case of some applications, the messages have disappeared, like under WhatsApp, and therefore there’s nothing to turn over. What the government is arguing for and what this resolution is arguing for is that the tech companies have to go further. They have to organize themselves so that they have the ability to decrypt, with a duplicate key, all the data that gets transferred, so that they have the ability to store things that you think you have deleted, so they can turn that over if there’s a request. And the fact of the matter is, under the Constitution and the traditions of this country, we do not require people to organize their lives so they store everything that they say and everything that they write so it will be available if somebody wants to come along later and investigate them.
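
    The split Chertoff describes, routing metadata a provider can produce under legal process versus content it cannot read, might be pictured like this. A hypothetical record layout, not any real provider’s schema:

```python
# Hypothetical shape of what a provider of an end-to-end encrypted
# messaging app actually holds about a single message. Illustrative only.
message_record = {
    # Metadata: visible to the provider, producible under a warrant
    # or subpoena.
    "sender_id":    "user-4821",
    "recipient_id": "user-9977",
    "sender_ip":    "203.0.113.7",          # documentation-range IP
    "timestamp":    "2017-06-06T21:00:00Z",
    "size_bytes":   1472,
    # Content: encrypted on the sender's device with keys the provider
    # never holds, so legal process yields only this opaque blob.
    "payload":      b"\x8f\x1a...opaque ciphertext...",
}
```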

    So what are we talking about here?

  • 00:30:00

    We’re talking about not just the values of the Constitution; we’re actually talking about our national security. Because if you open up the newspapers, what you see is that you have foreign nations hacking into our political parties, you have criminals stealing our financial data, you have terrorists trying to get information about where Americans might be going, particularly American service men and women, so they can target them. And the way to protect that data is not to expect the government to do it, because they don’t; it is to expect each individual to take the steps necessary to protect that information. And often that does require encryption, or sometimes it requires using an application in which the message disappears once it’s been read. Are there bad people who can use these things for their own purposes? Absolutely. But more significant is the number of good people who use these to protect themselves. And I would say to you that, as a matter of national security, it is the ability to let the vast majority protect themselves that’s got to be the highest value.

  • 00:31:05

    I will also tell you that the world is not going to go dark and we’re not going to be in mortal peril if in fact we have encrypted communication or disappearing messages that can’t be seized by the government. As I pointed out earlier, there are an enormous number of tools that are now available through meta data, locational data and similar things that the government can get and routinely gets from the tech companies. And even in the case of the San Bernardino folks, the data that had been uploaded to the cloud was turned over to the government. It was only the data on the phone that had not been uploaded that was inaccessible to the tech companies and allegedly to the government, although the government actually eventually broke into it. So there is plenty out there to protect us.

  • 00:31:56

    I would also say to you that even if there were a rule that said U.S. tech companies must have the capability to decrypt any message, or must have the ability to store and retrieve any message even if it “disappears,” that wouldn’t stop the bad guys, because they would go to other parts of the world or they would go onto the dark web, and they would simply buy encryption that couldn’t be opened, or they would simply buy a tool that allows them to make messages disappear. So what would happen is we would have reduced the protection for the law-abiding people, and we would not really have deterred the people who are not law-abiding. I believe the government and the tech companies can work together, but in a way that there is no sacrifice of the security of the innocent person who wants to protect his or her own financial data, private information, and health information. Thank you very much.

    [Applause]

    John Donvan:

    Thank you, Michael Chertoff. And that concludes round one of this Intelligence Squared U.S. debate. And now we move on to round two. And in round two, the debaters take questions from me.

  • 00:33:01

    They can address one another directly, and they also will take questions from you, our live audience here in San Francisco at the SF Jazz Center. We have two teams arguing for and against this motion: Tech companies should be required to help law enforcement execute search warrants to access customer data. The team arguing for the motion, Stewart Baker and John Yoo, have been arguing that the law and the history are clear. There is an obligation to help law enforcement, especially if you have a unique ability to offer that help; you cannot say no in that case, but within reasonable bounds. They say reasonableness is to be decided by the courts. They emphasize those boundaries are there. However, they also point out that there is no Silicon Valley exception to that rule, making reference to the position that Apple took in a case with the FBI in which, paraphrasing, Apple said, “We can, but we won’t.” They say that just like a landlord with a master key at the scene of a crime, they need to be able to turn the key.

  • 00:34:01

    The team arguing against the motion, Catherine Crump and Michael Chertoff, are arguing that encryption, and strong encryption in particular, is to a significant degree the crux of the matter as this issue moves forward in the present day. Strong encryption is the best defense against cyber-attack. You can’t build a back door that only works for the good guys. They are also arguing that there are plenty of other ways for law enforcement to use data that’s already available to it through search warrants that are not resisted by the tech companies. And they’re saying that the world will not go dark just because the FBI can’t get its hands on that master key. So I want to stipulate that it’s clear that both teams recognize that technology companies do have an obligation and have been meeting the obligation to hand over various kinds of data. They’ve been doing it for a long time, particularly metadata, information that’s in the cloud, et cetera.

  • 00:34:59

    And that we are here, whether the team arguing for the motion feels that they need to defend encryption or not, in light of where this challenge has moved, which is into a world where, as illustrated by the Apple case, the question of what is reasonable and what constitutes the unique ability to help law enforcement is now in contention. So I want to go first to the team arguing against the motion and go to Catherine Crump. Your opponent is basically saying that if a company has a unique ability to help, there’s an obligation, a citizen’s obligation, to do what needs to be done to turn that key in the lock. What’s your response to that?

    Catherine Crump:

    They have an obligation, but it is not an unlimited obligation. This came up in the Apple case, where you have to give reasonable assistance. And the real question was, at what point does what law enforcement is asking a company to do become an undue burden. And Apple maintained that being required to, for example, create a master key would be an undue burden.

  • 00:36:02

    Relying in part on a lot of the security arguments we talked about earlier, right — their inability to protect the data if they had this key, which would then be the type of thing that would be targeted by others.

    John Donvan:

    But what about Stewart’s point that it’s a unique — where there was a unique ability, that that actually changes the standard, when there’s only one agent who can actually help law enforcement, which is what I think is what he was implying by “unique ability,” that that changes the formula somewhat.

    Catherine Crump:

    You know, I don’t think it changes the formula because I think ultimately it comes down to this issue of burdensomeness. And that still is in play. And so someone’s capacity to help is certainly relevant, but it doesn’t mean that they have to do absolutely anything at all.

    John Donvan:

    Can I come —

    Michael Chertoff:

    Can I refine this just slightly? Because the issue was not burdensomeness in the sense that it was going to be burdensome on the company. The argument they made was that to create a master key that essentially disables the element of the operating system that shuts it down after you try a few times to break it would compromise not just the single phone, but would compromise all the operating systems for all the phones.

  • 00:37:09

    And that therefore, if someone got hold of that capability, it wouldn’t just be the single phone that would be broken, it would be everybody’s phone. And that was the burden they were worried about.
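
    The operating-system feature at the center of the Apple dispute, a cap on passcode guesses, can be modeled in a few lines. This is a conceptual sketch, not Apple’s actual code:

```python
# Conceptual model of a passcode retry limit; not Apple's implementation.
class LockedDevice:
    MAX_ATTEMPTS = 10

    def __init__(self, passcode: str):
        self._passcode = passcode
        self._failed = 0
        self.wiped = False

    def try_unlock(self, guess: str) -> bool:
        if self.wiped:
            return False            # data already destroyed
        if guess == self._passcode:
            self._failed = 0
            return True
        self._failed += 1
        if self._failed >= self.MAX_ATTEMPTS:
            self.wiped = True       # brute force is now useless
        return False
```

    What the FBI sought was, in effect, firmware with the wipe-and-lockout branch removed, so that all 10,000 four-digit passcodes could be tried by machine; the dispute on stage is over whether such a tool, once created, could be contained.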

    John Donvan:

    Stewart Baker to respond.

    Stewart Baker:

    So I feel the need to challenge the moderator as well as the other side. I apologize for this. But both you and the other side suggested that no one’s arguing that there’s no obligation to help in these circumstances. But in fact, when I was preparing for this, I actually talked to the Manhattan district attorney, who told me the following story. He said, “You know, we used to take the iPhone 4 to Apple, and they had every ability in the world to decrypt that. We brought them an order, we brought them the phone, they sent us back the contents of the phone.

  • 00:38:03

    It was a very productive relationship. We broke a lot of cases. And then one day, around the time that the FBI versus Apple case was heating up, they called and said, ‘Take back your phones. We’re out of the business of providing assistance to law enforcement.’ And he said, ‘What, you can’t do it anymore? What’s changed?’ And they said, basically, ‘Our minds have changed. We’re not going to do it anymore. Yes, we can. We still can. We choose not to.’” That’s cold, and that is defying the obligations to help when you can. And this is why I think we do need to say, yes, companies are required to assist law enforcement when they can, and that is my view of what the motion is, and it is not something that goes without saying in Silicon Valley, because at least Apple believes that they can just decide, for public relations reasons or because it doesn’t fit their litigation strategy, to stop providing help that they can provide.

  • 00:39:05

    John Donvan:

    John, do you want to come in?

    John Yoo:

    Yeah, thanks. I’ll take up the question of encryption because I think there’s also this claim that encryption’s somehow going to make all our data safe, and I don’t think that’s the case either. I think that’s over-claiming what technology actually can do. People mentioned John Podesta’s email being hacked by maybe the Russians — who knows? And how did they hack into it? Did they break the strong encryption of his email account? No, they sent him a standard phishing thing, and his password turned out to basically be password. It didn’t make any difference whether encryption was high or low that they got into his email account. The other thing I would say about encryption is, yes, the government was asking Apple to help it identify some kind of way to get into the phone itself, right? There are flaws in all the operating systems, right? There are ways for people to hack into Samsung and Android phones, or Apple phones.

  • 00:40:01

    And then they fix them, right? They download these patches. I have to download one every other day, it sounds like, or it seems like to me. And then you repair the — you repair the flaws. So when — I think when Mike says Apple’s being asked to break its own product and break all the products simultaneously, I don’t think that’s quite accurate. What they’re asking — the government’s asking Apple to do is try to help identify a way to get into this phone held by someone who carried out a terrorist attack in San Bernardino, and then obviously you can fix it after. It’s not like you’re going to publish the flaw and then say everybody, come on in and take John Podesta’s secret emails and now you know his password is password, too.

    John Donvan:

    Okay, which of the two of you would like to respond to that first?

    Michael Chertoff:

    Let me — just on this point, because I think it’s important to understand what’s at stake in the FBI case. First, let me point out they ultimately hired a company that managed to circumvent the operating system feature, and they discovered there was nothing of particular relevance on the phone. And they also had gotten from Apple, in compliance with the order, all the stuff that was backed up in the cloud.

  • 00:41:05

    The issue that was presented was, do you find a way to create — essentially create a vulnerability or a workaround in that feature of the operating system. And as Stewart kind of [unintelligible] points out, Cyrus Vance, who I’ve actually debated as well, had stacked up a few hundred requests to have phones broken with that system, too. So the reality is, once that vulnerability was created, it was going to be in constant demand for breaking these in the future. Now you might say, well, that’s okay, because they could keep the vulnerability hidden and protected, and so the bad guys wouldn’t get it. And my exhibit A against that is WannaCry, where the Shadow Brokers posted the vulnerability of Microsoft systems and they shut down the National Health Service. Even the U.S. government can’t protect some of the tools and exploits that it has developed.

    John Donvan:

    Stewart Baker.

  • 00:42:00

    Stewart Baker:

    So what Mike is sliding over is that what the government asked Apple to do was to use a hole, a backdoor that Apple had already built into its phone. How many people here got a U2 album you didn’t want on your phone? Yeah, that was Apple using its backdoor into your phone, their ability to update your phone any time they want to run any code they want on your phone. Now that’s a big security hole. We’ve heard everybody on the other side of the debate say, oh, that’s a fatal security hole, and yet Apple has built it into your phone. They did a balancing. They said, well, on balance, we have to give you security updates. The only way we can do that is if we have this backdoor into your phone. So on balance, it’s better to have the backdoor and to protect it than to have no backdoor, no ability to update your phone. And it’s the ability to update the phone that the FBI had asked Apple to use, to then make a change in the code that would allow you to continue trying combinations after the first 10.

  • 00:43:05

    But that wasn’t the secret that would have gotten somebody into a phone. The secret was how Apple guarantees that its updates reach you. And that backdoor already exists, and it’s being used by Apple in order to do good with security –

    John Donvan:

    Okay.

    Stewart Baker:

    — updates and to send you [inaudible] —
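
    The update channel Baker is pointing to rests on code signing: a device will install only code that verifies against the vendor’s public key. A sketch of that idea, assuming the third-party cryptography package; this is a simplification, not Apple’s actual mechanism:

```python
# Sketch of why a vendor's update channel is so powerful: devices run
# whatever the vendor signs. Simplified; not Apple's real pipeline.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

vendor_key = Ed25519PrivateKey.generate()    # held only by the vendor
device_trust_root = vendor_key.public_key()  # baked into every device

update = b"new firmware image"
signature = vendor_key.sign(update)

# A device installs any update that verifies, which is the power Baker
# calls a back door: whoever controls the signing key can change what
# every phone does.
try:
    device_trust_root.verify(signature, update)
    print("signature valid: device installs the update")
except InvalidSignature:
    print("rejected: not signed by the vendor")
```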

    John Donvan:

    Catherine Crump, do you want to respond to that point? Oh, well, do you want to take it, Michael?

    Michael Chertoff:

    Sure.

    John Donvan:

    [inaudible] —

    Michael Chertoff:

    I don’t want to get too bogged down in the factual discussion, but the key thing that they wanted that Apple didn’t want to do was to create an exploit that would then be uploaded, that would change the operating system and remove the feature that says, “After you try a certain number of times, everything gets shut down and you’re done.” That was the tool they wanted to have created, and that tool, once it had been created, would have been the target for everybody who wants to break into phones.

  • 00:44:00

    But I do think the resolution is broader than that — because again, nobody denies — I’ve never heard anybody in the tech community say, “We’re not going to obey court orders or subpoenas.” And as in this case, if they have access to data, they’ll turn it over. What’s really at stake and what’s been debated — and is at the heart of this is — don’t configure systems that deny you the ability to access information, which means, “Let’s dumb down and lower everybody’s protection.”

    John Donvan:

    Let me — I want to take a question to Catherine Crump, citing something that your opponent John Yoo said. He was talking about there being a balance between privacy on the one hand versus the need to pursue and guarantee national security, and that we’re in a time where the balance — we have to recognize the balance is shifting — that the threats to national security, particularly from terrorist groups that are exploiting this technology, encrypted or not, are on the rise, and obviously dangerous — and that therefore, we need to do a reconfiguration of the privacy issue. What’s your take on that?

  • 00:45:01

    Catherine Crump:

    Well, I think to view this as a privacy versus security debate is to misunderstand this particular debate. It’s a more security versus less security debate. And by using strong encryption, you both secure the privacy of the data and — for all of the reasons Michael Chertoff put out — you also improve security across the board for vulnerable data of corporations, governments, and individuals themselves.

    John Donvan:

    John Yoo, you want to respond?

    John Yoo:

    I would say, again, the touchstone is reasonableness. I don’t see where the Constitution says it’s up to Apple to decide what’s reasonable. It’s up to us, the American people, through our government, to decide what that reasonable balance is between privacy and gathering the information to try to increase the security of our country.

    And I think it’s — I still think I hear the argument that Stewart was arguing against, the idea that tech companies are somehow different, that they can willingly and intentionally design their systems to make it impossible for them to comply with these legal requests for information.

  • 00:46:01

    I think that everyone should be potentially subject, and that’s why we have a government and that’s why we have courts. I would much rather have a judge in Washington, D.C., or Congress pass legislation making that balance, rather than letting Apple, or Facebook, or Google decide — or Samsung in Korea decide — what our balance between security and privacy ought to be.

    John Donvan:

    But your opponent, Catherine Crump, is saying the privacy and security issue is not the real issue. She’s saying that it’s security versus more security. And her argument is actually for even greater security, which — I think her implication is it would make it even more difficult for the government to get access –

    John Yoo:

    That’s possible. Let me be the first to say, it’s possible that a consequence of more encryption might actually be better security for our country. I just don’t see why Apple gets to decide that for the United States. I think, if that’s really a consequence of increasing encryption, then our government — whom we elect and send to Washington — should make that call. If there really is anything that our government can do to increase national security with no cost to anyone, then it should do it and should have done it after 9/11.

  • 00:47:06

    Tradeoffs exist in everything. There’s no kind of policy where you can have more of all the good stuff and none of the bad stuff. Everything is a tradeoff. My question is, who conducts the balance? It shouldn’t be tech companies.

    Michael Chertoff:

    Well, I mean, I don’t disagree with this. The government ultimately can pass a law. That’s the debate. The question about the resolution is, should the government pass a law that basically mandates to tech companies or anybody else, “You cannot configure your products in such a way that will not allow you to comply with a court order to turn over information.” And if the decision is that you can’t configure your products, you’re going to wind up — and it would be, I think, an unwise decision — you’re going to wind up hurting the security of everybody else that’s innocent, that’s not the subject of a subpoena. You know, it would be as if the government were to pass a law and say,

  • 00:48:00

    “You should not be able to delete any emails ever that you generate,” or “You shouldn’t be able to use an application where the message disappears after it’s read,” or “You shouldn’t be able to turn your phone off. You should keep it on to record everything you say all the time.” That would make it very easy for law enforcement when they targeted you to find all the evidence against you. But it would also mean that everybody else would be walking around being their own big brother. And that would have not only privacy but security implications.

    John Donvan:

    Stewart Baker, I want to take to you also Catherine Crump’s point that there’s actually a need for greater encryption for the sake of everybody’s security — that in fact there should be more, which again would imply less access by the government to get inside these –

    Stewart Baker:

    I don’t think so. And I want to come back to the point that Apple has a back door into our phones. And they have balanced the value of having the back door —

    John Donvan:

    Well, let me —

    Stewart Baker:

    — for security purposes —

    John Donvan:

    You made it. But I want to get to — the part that “I don’t think so.” Why don’t you think so?

  • 00:49:01

    Stewart Baker:

    Yeah, so I’ll —

    John Donvan:

    [unintelligible]

    Stewart Baker:

    So they balanced. They said, “Well, we give our customers more security, which makes us a more profitable company, and we get to give them U2 albums.” What they left out of that balance was all of us who suffer from crime. That’s not on their balance sheet, and they did not take it into account. What I’m arguing is that we need to take into account the consequences of encryption in deciding whether there ought to be access to the phone.

    And just as Apple made the conclusion that we are better off on balance with a back door for them, but a very well protected back door, there’s been no leaks, we should come to the conclusion that that back door or something like it should be used to protect the rest of us and not just the profits of Apple.

    John Donvan:

    Catherine Crump to respond.

  • 00:50:00

    Catherine Crump:

    I don’t think any of us disagree that we want to help people who are victims of crime. The question is what is the best way to do that. And it would be — the consensus of tech experts is that if you create these back doors, then you will increase the vulnerability of people across the board. And given the fact that the vast majority of people are innocent and we all have sensitive data stored on these systems, the back door is the wrong way to go.

    Stewart Baker:

    Can I just ask —

    John Donvan:

    Sure.

    Stewart Baker:

    — one question, Catherine? If the consensus of everybody in the tech community is that back doors are a bad idea, why is it that Apple has a back door? Why is it that Microsoft gives us automatic updates that we can’t turn down? They have built a back door in.

    Catherine Crump:

    You know, I’m not a computer scientist. I’m relying on people like Susan Landau and Ed Felten at Princeton who’ve made this point, right? None of us on this stage are computer scientists. But their considered view is that by installing a back door that would allow you to override encryption, you will make that data less secure.

  • 00:51:05

    Stewart Baker:

    Yeah, the back doors you’re talking about don’t override encryption. I’m not worried about bad guys stealing my U2 album off my iPhone.

    Male Speaker:

    Please.

    Stewart Baker:

    The back doors — the back doors we’re talking about and what is being suggested is the back door that allows you to decrypt something that’s encrypted.

    Michael Chertoff:

    But I’m sorry. Apple can put anything on your phone to run anywhere and make it do anything.

    Male Speaker:

    But if you —

    Michael Chertoff:

    — everything you do, that is a back door in space.

    Male Speaker:

    But if you have an encrypted app, then it’s encrypted, and you have the key and the sender has the key, period. And these apps are designed, whether it be Signal or a whole lot of other things, exactly so that only two people, the sender and the recipient, can get it. And that is the issue that Cyrus Vance wants to change. He wants the government or somebody else to have a second key or a vulnerability that would allow it to be degraded.
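
    The “second key” idea attributed here to Cyrus Vance is, in essence, key escrow: every message key is also wrapped for a third party. A minimal sketch, again assuming the third-party cryptography package, of why critics call that escrow key a central point of failure:

```python
# Minimal key-escrow sketch using the third-party "cryptography"
# package. Illustrative of the structure, not any real proposal.
from cryptography.fernet import Fernet

escrow_key = Fernet.generate_key()          # held by government or vendor
escrow = Fernet(escrow_key)

def send_message(plaintext: bytes):
    msg_key = Fernet.generate_key()         # fresh key per message
    ciphertext = Fernet(msg_key).encrypt(plaintext)
    wrapped_for_escrow = escrow.encrypt(msg_key)  # the "second key" copy
    return ciphertext, wrapped_for_escrow

ct, wrapped = send_message(b"hello")

# Whoever obtains escrow_key, lawfully or by theft, can unwrap the
# per-message key for every message ever sent under this scheme.
recovered = Fernet(escrow.decrypt(wrapped)).decrypt(ct)
assert recovered == b"hello"
```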

  • 00:52:04

    Male Speaker:

    That is not our view. Our view is that tech companies should help where they can. Apple has a back door. They should use it to help. If you built a product that doesn’t have a back door, nobody is saying you have to help because you can’t.

    John Donvan:

    John Yoo, I want to go back to your opening statement in talking — you know, citing the Fourth Amendment, and to understand for you where the boundaries of reasonableness are. Your opponents are arguing that the request to Apple — and we’re using Apple as an illustration. We’re not really litigating the Apple case, but it’s illustrative that the request that tech companies essentially assist the Feds in busting their encryption is unreasonable. Is that –

    John Yoo:

    – in the end we should recognize it’s really the Israelis who are good at this. They’re the ones that the FBI hired to break into the phone. We could actually do it ourselves.

    John Donvan:

    All right. Is it unreasonable?

    John Yoo:

    No. I don’t — again, reasonableness, there’s no formula for reasonableness other than balanced security —

  • 00:53:03

    John Donvan:

    Okay, but you’re — let’s make him a judge.

    John Yoo:

    Oh, God, I think the Senate would actually shut down and go on strike if I was one — what you should do, you should nominate me actually, because then we will shut down the — no. I think, you know, if — but then who’s going to oversee the impeachment trial? No, so, I — I think that if you’re a judge and you look at the gains to our security — and what worries me is not only do we have that string of terrorist attacks in the U.S., but it’s going to get worse, not better, because people who analyze the Middle East say, as we encounter more battlefield successes on the ground and we start to eliminate this [unintelligible] caliphate, ISIS is going to send more people abroad and it’s going to try to encourage more people to carry out the kinds of attacks we’ve been seeing in Paris, London, but also in the United States. That you balance against the loss of privacy. Again, no one, I don’t think, is going to admit there’s no loss of privacy. I’m —

  • 00:54:00

    John Donvan:

    But where would your balance be on the issue of asking Apple to crack its security? And with your opponents suggesting the risk there, number one, is to Apple’s business model, they sell encryption in the phones. Number two, it’s the risk that the crack would get out to bad guys.

    John Yoo:

    Yeah, so I actually don’t have a big problem with asking Apple to do that. I mean, asking Apple to do it or the government compelling Apple to do that, given the circumstances of what happened in San Bernardino where you’re trying to track down —

    John Donvan:

    Okay, talk about that. How come?

    John Yoo:

    Because this guy apparently, we thought, might — I should say “we”; I wasn’t in the government. Obama wouldn’t hire me for some reason. So the government thought there might be information on his phone that might lead to a broader conspiracy. Mike’s right, it turned out it didn’t. But you don’t know that beforehand. And that’s an important point — you have to put yourself in the position of the people who are trying to protect their country at the time they’re doing it. We don’t know how big the conspiracy was. I think it’s going to become a bigger and broader problem, these series of attacks.

  • 00:55:00

    The loss of privacy I think, on the other hand, is up to us as voters to decide. Again, I don’t think we should say, oh, Apple gets to decide whether the loss of privacy —

    John Donvan:

    Let me take — let me take your justification for your position to Catherine Crump and how do you respond to everything that you just heard John say?

    Catherine Crump:

    I think that the encryption, with users controlling the key, means that users are in control of their own data. It puts them, rather than the companies or anyone else, in charge. And overriding that creates security problems.

    John Donvan:

    But he’s saying — John’s saying life and death. Life and death just trumps it all. I’m going to change that word.

    [Laughter]

    John Yoo:

    I never use the word “trump.”

    John Donvan:

    Yeah.

    John Yoo:

    [unintelligible] will never — it’s like to play bridge.

    John Donvan:

    The life and death issue is decisive and it [unintelligible].

    Catherine Crump:

    And no one is denying that it can be a serious cost to law enforcement not to be able to access the content of someone’s phone. The question is, how do you balance that cost against the cost of not having encryption, particularly in an era where law enforcement has lots of other information available to it.

  • 00:56:05

    You — every time you walk around the city, you’re picked up on myriad surveillance cameras. Encryption doesn’t change that. Automatic license plate readers blanket the streets. Encryption doesn’t change that. Even when you can’t access the content of communications, you will often be able to identify metadata about the communication: who sent the email, what time the phone calls took place. And many law enforcement and national security officials believe that that metadata can actually be what’s really important to solving crimes.
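
    [The metadata point is easy to demonstrate: the envelope of a message stays readable even when its body does not. Below is a small sketch using Python’s standard-library email module; the message itself is invented.]

    ```python
    # Even when a message body is encrypted, the "envelope" is
    # typically readable: who sent it, to whom, and when. Python's
    # standard-library email module pulls those headers out.
    # The raw message below is invented for illustration.
    from email import message_from_string

    raw = (
        "From: alice@example.com\n"
        "To: bob@example.com\n"
        "Date: Tue, 06 Jun 2017 21:00:00 -0700\n"
        "Subject: (encrypted)\n"
        "\n"
        "<ciphertext body, unreadable without the key>\n"
    )

    msg = message_from_string(raw)
    # Who talked to whom, and when, with no decryption required.
    for header in ("From", "To", "Date"):
        print(header, "->", msg[header])
    ```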

    John Donvan:

    So interesting that your opponents are saying, Stewart Baker, that you don’t really need that, because there’s so much other information yielding so much insight, and also, as Michael Chertoff pointed out, when the Feds actually finally cracked the Apple phone, there was apparently nothing on it. So what’s your response to that?

    Stewart Baker:

    That would be overstating the case.

    John Donvan:

    I know you’re not — I know you’re not arguing for the cracking —

  • 00:57:00

    Stewart Baker:

    There’s never a guarantee when you go into a phone that you’re going to find the evidence you’re hoping for. But on average, you do. You know, the argument sort of boils down to, well, it’s a great time to be a cop because there’s so much data; technology is making life easier for you. And in some ways that’s true, but technology is making life great for the criminals too.

    Prior to 2016, no one imagined that the Russians could change the outcome of an election just by sitting in Moscow and having fun with the files that they stole, or that ISIS could recruit teenagers in Minneapolis to carry out attacks without ever coming into the United States. Technology is transforming crime in the same way it’s transforming crime-busting, but it’s not clear that on balance law enforcement ends up better off.

  • 00:58:01

    And when we can see a real criminal law enforcement problem arising from new technology, of course we ought to consider regulating it. I should stress this is not the argument we have to make to win this debate. All we have to say is they have an obligation to help, and they are not doing it.

    John Donvan:

    I’d like to go to audience questions now. Right here behind the bars, you’re wearing a white shirt, if you can stand up, a microphone will come down from here, right on this side. What’s your name, sir?

    Male Speaker:

    My question is —

    John Donvan:

    Could you tell us your name, please?

    Male Speaker:

    Stephen Maine [spelled phonetically]. I work in San Francisco, resident of Marin County. Isn’t the core issue of the Fourth Amendment the protection of the expectation of confidentiality and the right of privacy?

    John Donvan:

    Great question.

    Male Speaker:

    Isn’t that the — isn’t that the core?

    John Donvan:

    I think that’s a challenge to John Yoo’s side, so I’d like to take it to John Yoo.

    John Yoo:

    Yeah, I’m happy to hide behind the Supreme Court on this one.

  • 00:59:00

    They don’t say that the Fourth Amendment itself puts that value above all others. It says you balance it. You’re quite right: the privacy interest, which we actually didn’t talk about that much, so thanks for bringing it up, is society’s reasonable expectation of privacy. And it could be phone calls, written letters, whatever. But you always balance that against security, right? I mean, the Supreme Court’s been very clear that we have to balance the two values. It’s hard to actually figure out how we measure what society’s reasonable expectation of privacy is, and that’s why, when we’ve had these technological changes in the past, with the telegraph, the telephone, money transfers, ultimately we’ve asked Congress to step in and pass a law and make a judgment. In the beginning the courts did it; eventually, Congress. And in no case did our elected representatives or any of the judges say privacy — I was going to say trumps every — privacy outweighs all other values. It’s what’s reasonable to us as a society, balancing the two.

  • 01:00:02

    John Donvan:

    So I want to let Michael Chertoff actually follow up, if he’d like to, or Catherine, if you’d like to. Catherine.

    Catherine Crump:

    Yeah, well, I think we agree about what the applicable standard is, right? It’s a balance between an individual’s expectation of privacy and the public safety needs on the other side. I think we just disagree about how that comes out in this particular case.

    John Donvan:

    Down near the front row here.

    Male Speaker:

    My name’s Raphael. I’m actually going to Berkeley Law.

    [Laughter]

    My question is essentially: you’re saying that tech companies don’t have the ability to help because the data’s encrypted and the user has the key. What I’m saying is, does the company still have the key because of artificial intelligence? To illustrate that, Gmail now allows you to auto-reply, and that’s based on your content. Can you say that tech companies do not have the key?

    Michael Chertoff:

    So —

    John Donvan:

    Okay. Michael Chertoff.

  • 01:01:00

    Michael Chertoff:

    Yeah. So, some companies do keep a key — or some enterprises do keep a duplicate key because they want to be able to see what their employees are doing, for example. No one is arguing on our side that tech companies should disobey court orders. If you have the capability — if you have a duplicate key and a court orders you to turn it over, game over. You turn it over. The question in the resolution is, is there an obligation to help — meaning, do you have to configure your system in such a way that you’ll always have that duplicate key? Some companies don’t maintain the duplicate key. And in that instance, they can’t comply. And if Congress were to adopt the principle in the resolution, Congress would say, “Oh, when you design encryption, you must always have a duplicate key or a backdoor.” They tried to do that about 20 years ago with something we call the Clipper chip.

  • 01:02:01

    And it kind of failed, because there were problems with the way it was executed, in terms of being vulnerable. So, no one’s arguing, “Don’t comply with an order if you can.” What we’re arguing is you’re not obliged to arrange your life so that it’s easy for the government to get a court order to have you turn this over.
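
    [The duplicate-key arrangement described here can be sketched in a few lines. This is a hypothetical illustration using the Fernet recipe from Python’s cryptography package; the key names are invented, and it is not how any particular company implements escrow.]

    ```python
    # "Duplicate key" (escrow) in outline: the data key is wrapped
    # twice, once for the user and once for an escrow holder, so a
    # court order served on the escrow holder can always unlock the
    # data. Requires the cryptography package. Illustrative only.
    from cryptography.fernet import Fernet

    data_key = Fernet.generate_key()    # encrypts the actual data
    user_key = Fernet.generate_key()    # held by the user
    escrow_key = Fernet.generate_key()  # held by company or escrow agent

    ciphertext = Fernet(data_key).encrypt(b"customer records")

    # The same data key is stored wrapped under both keys.
    wrapped_for_user = Fernet(user_key).encrypt(data_key)
    wrapped_for_escrow = Fernet(escrow_key).encrypt(data_key)

    # The escrow holder can recover the data without the user:
    recovered = Fernet(escrow_key).decrypt(wrapped_for_escrow)
    assert Fernet(recovered).decrypt(ciphertext) == b"customer records"
    ```

    [The security objection raised on the other side is visible in the same sketch: anyone who compromises escrow_key can recover every wrapped data key.]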

    John Donvan:

    Stewart Baker to respond?

    Stewart Baker:

    So, I — you made a good point, that for some companies, having that data is so important that they discourage encryption. Their business model is such that they want the data. They don’t really want you to encrypt it. And so, they make choices that disincentivize encryption. Apple doesn’t live off the data, and they have created a market niche for themselves that says, “Come to us. We don’t use your data.” And there’s a perfectly good argument that they were using the San Bernardino case as an exercise in free marketing, to show that they were the privacy protectors and that Google and their other competitors are not.

  • 01:03:04

    I think, at the end of the day, though, the question is, is anybody here comfortable saying, “We’re going to trust our privacy and our security to the marketing and the technological profit-driven decisions of the tech companies”? Does anybody think they have our interests at heart — not just selling us stuff, but keeping us safe? I just don’t believe it, and I don’t think we should rely on them to make that call.

    Catherine Crump:

    So, I think your comments, though, raise an interesting point, which is: what are the market incentives of tech companies? And for a lot of purposes, tech companies are not going to want to have data encrypted. So, for example, your Gmail account isn’t end-to-end encrypted because you’re going to want certain functionalities, and certain companies are going to want to be able to access the data in order to sell you advertisements, for example. So, I think you need to think about the scope of the encryption problem as being limited, because there are a lot of market incentives on the other side that are going to limit the use of this tool.

  • 01:04:03

    John Yoo:

    I think this raises, actually, an interesting point — goes back to the first —

    John Donvan:

    John Yoo.

    John Yoo:

    — question too, about encryption. I find it actually strange: as a society, we’re more than happy to surrender lots of privacy to companies to mine our emails and then pop up weird ads about things they think I want to buy, places I’ve been. When we consent, we actually decide what’s reasonable. Are we going to let tech companies not only design systems that use artificial intelligence to make them unbreakable, but then also just say, “Yeah, we’re not going to help you try to figure out a way to defeat it” in the next crisis? As a society, I find it very likely we’re going to say, “The government should at least have the same right to mine that data” as we’re already giving all these companies.

    John Donvan:

    I want to remind you that we are in the question and answer section of this Intelligence Squared U.S. debate. I’m John Donvan, your host. We have four debaters, two teams of two, debating this motion: Tech Companies Should Be Required To Help Law Enforcement Execute Search Warrants To Access Customer Data. Going back to audience questions.

  • 01:05:02

    And — err — [unintelligible] — ma’am in the red sweater, if you could stand up.

    Female Speaker:

    Hi. My name is Kate Conger [spelled phonetically]. My question is about the life and death issue that you raised. Earlier you were talking about how law enforcement needs access to encrypted messages to save lives. And I’m wondering how you balance those lives with the lives of our service men and women overseas whose locations are protected by encryption — how you balance those lives against the lives of victims of intimate partner violence who might be hiding their information, their location from their spouse via encryption. Why are the lives of terror victims worth more than the lives of service men and women — of women who are being killed by their partners?

    John Donvan:

    And your question is directed to?

  • 01:06:01

    Female Speaker:

    Stewart? Would you like to take that?

    John Donvan:

    Okay, Stewart. Yeah, no, first name basis.

    Stewart Baker:

    Yes.

    [Laughter]

    Catherine Crump:

    [unintelligible] doesn’t have a tie on.

    Stewart Baker:

    Yeah, exactly. Yeah, come on. So no one is arguing that what we want is completely insecure phones that give away data that can get people killed. As I said, Apple has built the technology that allows them to modify phones one at a time if necessary. And they have protected that successfully. And that means that the data they’ve protected has not been given away to lead to the deaths of innocents. But they could use that technology to protect innocents, and they’re not doing it. And in my view, they should.

    John Donvan:

    Other side like to respond?

    Michael Chertoff:

    I — you know, we could go around and around arguing particular facts in the case. They wanted Apple to do a modification that would ultimately, if it got out, have affected all phones.

  • 01:07:05

    But let’s put —

    John Donvan:

    That’s — I just want to say that that’s certainly the way the argue — Apple presented —

    [talking simultaneously]

    John Donvan:

    Let Michael finish.

    Michael Chertoff:

    But we’re not on trial. We don’t have evidence here. So let’s take the broader proposition. If Apple didn’t add U2 music to your phone, I think your comment is dead right, exactly right. There is real security value to encryption. And if you require companies that have encryption without U2 updates to have a back door, you would weaken that encryption. That’s what all the engineers say. And that means if a bad guy either discovered the vulnerability or got ahold of the exploit, they would then have the ability to compromise the safety of the people you are describing. And if you said, well, that’s okay, the government can protect it, I just have one word: WannaCry.

  • 01:08:00

    Apparently, as reported in the press, the government wasn’t so good at protecting the exploit based on a generally available vulnerability, and hospitals were shut down, and there was a global impact. And that’s exactly the kind of thing you don’t want to do.

    John Yoo:

    Can I jump in?

    John Donvan:

    Yes, please do.

    [Applause]

    Stewart Baker:

    Two points. We’ve heard now this debate about whether it’s Apple’s own back door that is at risk, or the magic of getting rid of the limits on how many times you can try to guess a password before the machine wipes out its data. Well, you know, if you’ve done any coding at all, you know that there’s a line in there that says, “After X tries, wipe the data.” And if you went in and changed the 10 to 1 million, you would have done what the government asked. That is not a secret. That is not hard.

  • 01:09:01

    There is nothing that protects you against that change being made other than Apple’s secret, which is how to get the phone to accept the code change. And there was no argument by the FBI that the secret for how to do that should be given to the United States government to protect in some database that was subject to a leak. Apple could have kept that secret and just fixed the phone so that it didn’t wipe out the data after 10 tries. That’s all they had to do, and they chose not to do it.
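
    [The retry-limit mechanism described above can be modeled in a few lines. This is a toy sketch, not Apple’s actual code; the PIN, the constant, and the wipe behavior are all invented for illustration.]

    ```python
    # Toy model of a passcode retry limit: after MAX_TRIES wrong
    # guesses the device "wipes" its data. Raising that one constant
    # is what turns a protected phone into a brute-forceable one,
    # since a 4-digit PIN has only 10,000 candidates. Not Apple's code.
    MAX_TRIES = 10        # the line at issue: change 10 to 1_000_000
    SECRET_PIN = "7294"   # invented for illustration

    def try_unlock(guesses, max_tries=MAX_TRIES):
        """Return the PIN if guessed before the wipe limit, else None."""
        for failures, guess in enumerate(guesses):
            if guess == SECRET_PIN:
                return guess
            if failures + 1 >= max_tries:
                return None  # device wipes its data here
        return None

    all_pins = [f"{n:04d}" for n in range(10_000)]
    assert try_unlock(all_pins) is None                  # wiped after 10 tries
    assert try_unlock(all_pins, max_tries=1_000_000) == SECRET_PIN
    ```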

    John Donvan:

    Man in the light blue shirt because I saw you shaking your head during Stewart’s comment. But I don’t want you to argue with Stewart. I want you to ask a question. If you could stand up and tell us your name, please.

    Male Speaker:

    Well, I heard that —

    John Donvan:

    What’s your name?

    Male Speaker:

    Walter Mafaley [spelled phonetically]

    John Donvan:

    Thanks.

    Male Speaker:

    I’m a former software engineer. I work for a legal department at a big tech company.

  • 01:10:01

    And I too will be attending Berkeley Law.

    John Donvan:

    Yes!

    [Laughter]

    John Yoo:

    You’re all going to take Catherine’s classes then, you know. You don’t want to read Alexander Hamilton with me, do you?

    Male Speaker:

    Well, professor, I wanted to go back to a point you made a little earlier about how cryptographic encryption is not a panacea. And you’re absolutely right. The general consensus in computer security circles is that there’s always another flaw. And in fact, the Apple and FBI fight was mooted when the FBI found another way into the phone without Apple’s help.

    John Donvan:

    I need you to go to a question now.

    Male Speaker:

    So because there are more flaws out there, doesn’t that put the burden back on law enforcement rather than asking tech companies to help?

    John Donvan:

    I’d like to let this side answer first, and then I would like to hear the other side’s response to that. John Yoo.

    John Yoo:

    I think — I’m not sure exactly what the — how to answer the question.

  • 01:11:00

    But what I — I think there’s a false choice here that’s being presented by our worthy and handsome and attractive opponents. And that’s the choice between letting the government have access — and I think you heard it in the last question — and complete vulnerability. I don’t think that’s true. I think it’s very much as you describe. There are these programs and operating systems, and then there are flaws, and then we correct them. And sometimes we can use the flaws to society’s advantage, and then we correct them. You know, some people often use this analogy of locks on doors. And I’ve heard it said, oh, what you’re asking Apple to do is to change the locks so that no one can have locks on their doors. I don’t think that’s really what’s going on, based on what I understand — I’m not a computer scientist, but I did have a TRS-80 back in the late ’70s, early 1980s. I bet very few people here can make that same claim. They had a Radio Shack —

    John Donvan:

    I can.

    John Yoo:

    — computer as a kid.

    Male Speaker:

    Oh, you have so dated yourself. Probably liked Katrina and the Waves, too, as your favorite band.

    John Donvan:

    Ultimately you made a very good wheel chuck.

  • 01:12:00

    John Yoo:

    But the point is, right, that it’s not the case that if you help the government, all of a sudden everybody’s data is suddenly visible. I think we use, you know, adaptation: computer scientists will fix the flaw and the locks are restored. All we’re asking is to say to the locksmith, come to this house, please open this lock. We’re not asking you to take all the locks off all the doors.

    John Donvan:

    Catherine Crump.

    Catherine Crump:

    Yeah, I think your point is a good one. Encryption, while it may be the best tool we have available, often isn’t perfect. The Apple case illustrated that: they were able to get in using a vulnerability that they purchased. There will continue to be vulnerabilities, and particularly in high-profile, high-value investigations, they may continue to be used.

    John Donvan:

    Another question? Right down front, sir. A mic’s going to come for you. Just one second.

    Male Speaker:

    Now, you realize you just called on the former chief justice of the State of California. So whatever he says I’m going to agree with it.

    [Laughter]

    Male Speaker:

    I’m asking —

  • 01:13:00

    John Donvan:

    Please tell us your name.

    Male Speaker:

    — as a layperson. Ronald George.

    John Donvan:

    Thank you.

    Male Speaker:

    I just wonder whether, in weighing privacy interests against security interests, if Congress, in considering a law requiring that such help be provided, were to be presented with credible evidence that there were plans to import a nuclear device into the United States, that kind of substantial showing would change the position that the no side has, or whether you would still adhere to the same position.

    John Donvan:

    Michael Chertoff.

    Michael Chertoff:

    Well, you know, the position we have, again, is that the companies have to comply with the law and the rules. The issue is, when a company’s capabilities are configured in such a way that they just don’t have access to the information, that’s going to frustrate law enforcement. If law enforcement can figure out its own way to get that, God bless them.

  • 01:14:00

    But the real challenge presented here, and the argument that some people in Congress have made, is that you should prevent tech companies from organizing themselves in such a way that they don’t have access to all the information when they’re required to turn it over. So let’s pick something a little less esoteric than encryption. There are messaging applications now where, once you’ve read the message, it disappears. And should the government be able to say, wait a second, terrorists could be communicating about when that shipment of nuclear materials is coming in, and the message is going to disappear, and we’re never going to know what was said? So let’s require every company to store all those messages, although they appear to disappear. And that would apply to everybody, because Congress doesn’t pass a law knowing who a particular terrorist is; the law is generally applicable. I would argue two things: that it would be inappropriate to pass a law like that, even though, at some point, you might really need to get a message that otherwise wouldn’t be saved.

  • 01:15:03

    But I would also tell you, based on my experience, going back to a period after 9/11, the kind of data that is available now that is turned over by the tech companies and is generated by the tech companies precisely because they are business people, makes the kind of stuff we got in 2001 look like child’s play.

    It would have been a dream to have the kind of data that’s available now back 10 years ago when we were responding to 9/11. So if you look at technology development and ask whether, net net, it has been good for security, I will tell you it has dramatically tipped the balance in favor of security.

    John Donvan:

    Other side like to answer that? Stewart Baker.

    Stewart Baker:

    Yeah, very briefly. I think that example shows that there are times when we simply will not leave the decision to the companies. If you thought there was a phone that had data about the importation of a nuclear device into the United States, no one would be saying, oh, well, it’s Apple’s choice whether they’re going to use their backdoor to provide access.

  • 01:16:10

    We would say this is a choice that ought to be made by society as a whole through our elected representatives. I — and if you believe that, then I think the answer to the question is, yes, there are times when companies should be required to help law enforcement.

    John Yoo:

    Also, we would hope American companies would willingly try to help in such a situation; they shouldn’t need the compulsion of the law. What worries us, I think, is this growing atmosphere that it’s okay for tech companies to say, no, we’re not going to help the government, even at a scale of threat as high as the one you’re proposing, Mr. Chief Justice.

    John Donvan:

    Ma’am on the aisle, there. If you could stand up, they’ll be able to see you for the microphone.

    Female Speaker:

    Hi, my name’s Audrey. I’m a genetic counselor, actually, so I work in genetic testing.

  • 01:17:01

    This is a question for John or Stewart; I am agnostic as to which one of you answers. The cost of genetic testing has basically dropped dramatically, and there’s a real philosophy that democratizing people’s access to the information really requires the tech companies: it’s a huge amount of data, and that data needs to get stored and analyzed. And my question would be, do you think there should be some kind of backdoor key? Should there be something set up to store the genetic information where the companies need to make it accessible to law enforcement? Or is there a case where that’s actually going to be useful, where it makes sense to compel a company to reveal that type of information, where they’re not really making a profit off of that type of backdoor?

    John Yoo:

    So I’ll answer just because my brother is in this industry, too. It seems to me that we’ve already made a choice about the privacy of genetic data — not the exact hypothetical you have, but DNA testing to track down crimes.

  • 01:18:04

    We could have had a regime or a world where we said the government is not allowed to know your DNA sequence, it cannot do DNA testing, that’s part of your right to privacy, your individual right. And that would be very similar to your idea: if I’m going to hire a genetic testing company to sequence my code, the government can never look at that either. I mean, companies could take that position. But we don’t have that view, right? We are actually expanding quite broadly the use of DNA testing to help us solve crimes. And it’s not just the privacy [unintelligible] — you’re losing a little bit of privacy by letting the government do that, but we are also protecting victims. We’re solving a lot of crimes. We’re also proving that some people who were convicted were actually innocent, and those people are being released from jail. I would just say, as a society we’ve already made the judgment you’re asking about: that it’s reasonable in certain circumstances for the government to have access to your DNA testing in order to solve a crime.

  • 01:19:05

    John Donvan:

    Catherine, would you like to take a crack at that?

    Catherine Crump:

    Yeah, I don’t think there’s a point of disagreement here. It sounds like in your example the company itself can access the data, and in that circumstance if the company is capable of accessing the data, they need to comply with whatever lawful process the government uses to get the data. I think this debate is more focused on circumstances in which the data is protected even from the company.

    John Donvan:

    And that concludes round two of this Intelligence Squared US debate, where our motion is tech companies should be required to help law enforcement execute search warrants to access customer data.

    [Applause]

    And now we move on to round three. Round three is closing statements by each debater in turn. Here, making his closing statement in support of the motion, Stewart Baker, former Assistant Secretary for Policy at the Department of Homeland Security.

    Stewart Baker:

    First, for those of you who come back, I should introduce you to a concept that I learned at a conference yesterday from the Israelis — the Israeli question — which is a speech followed by the words, “Don’t you agree?”

  • 01:20:08

    [Laughter]

    While I was thinking about this issue and researching it, I came across the case of a woman named Brittany Mills [spelled phonetically], who answered her door one day and was shot dead at point-blank range. The police know she knew the person she opened the door for. They know nothing else. They do know she had an iPhone and that she kept a diary on it. Her mother says that she was careful to keep those records. Apple was not prepared to provide any assistance in finding out what’s on that phone. That can’t be right. Tim Cook has given many speeches about how companies have values because people have values, and Apple has values.

  • 01:21:04

    And they care about the environment, and they work hard — even sacrifice profits because of their concern for the environment. I think the message I would want to send them out of this debate is they need to have a concern for the Brittany Mills of the world as well. That privacy they were providing is not doing her any good, and she never wanted this kind of privacy. And so, I would ask that you vote to say “Yes,” companies can be required to help law enforcement to execute search warrants. Thank you.

    John Donvan:

    Thank you, Stewart Baker. [Applause] And here making her closing statement against the motion, Catherine Crump, acting director of the Samuelson Law, Technology & Public Policy Clinic at Berkeley Law.

    Catherine Crump:

    We all want to help the Brittany Millses of the world. But the question here is, where is the greater good? Are we going to make everyone’s communication insecure in order to create a backdoor?

  • 01:22:06

    And I’ll just tell one story, which is that 20 years ago, the United States, in a law called CALEA, and most other countries, decided to create a requirement that there would be a backdoor for telephone switches. About 10 years ago, someone illegally wiretapped the phones of many people in Greece using one of these backdoors. It included the prime minister. It included the mayor of Athens, and so on and so forth. So, when you create these backdoors, they are vulnerable. They can be abused. And the better choice is to try to secure everyone’s data across the board.

    John Donvan:

    Thank you, Catherine Crump.

    [Applause]

    And now making his closing statement in support of the motion, John Yoo, law professor at UC Berkeley.

    John Yoo:

    So, unlike my fellow panelists, I don’t have a good story. I’m not Irish, I’m Korean. We’re not good at stories.

  • 01:23:01

    So, I have no witty thing that’s going to sum it all up, the way that JFK or Tip O’Neill could have. I wish —

    Stewart Baker:

    He has a Korean mom. He cannot go home of [inaudible] —

    John Yoo:

    That is true. Please, please vote for us. My mom is asking you —

    [Laughter]

    — actually, I wanted to go back to something Jeff Rosen said when he started this whole thing. And he said he always asks himself what Brandeis would do. And I actually always ask myself, “What would Hamilton do?” And the reason I ask is because Hamilton is so cool and hip right now. They even make rap music about him. I’m not rich enough to have the pull to actually get in to see the show, but I hear Hamilton even talks in rap. This is amazing to me. I’ve been studying Hamilton for 25 years and I love the guy. And I think what Hamilton said is something we should come back to, because Hamilton was involved with drafting the Constitution. He was the first Treasury Secretary. You all know this because all of you have seen the play. Hamilton said the primary mission — the purpose of government — is the protection of the community from attacks.

  • 01:24:01

    He didn’t say it trumped everything. It doesn’t mean that we have to live in a world with no protections, or no security, or no privacy for our data. But it means that ultimately, when it comes down to it — and this is the question, I think, that Chief Justice George properly raised — we all have to balance the needs of the government against our privacy rights. And as a society, we can sometimes, and should, decide that we want to trade off some amount of privacy for security. Anyone who’s telling you that that’s a false choice, I think, is not being truthful. There’s always a tradeoff in anything we do, any government policy that we reach. And I think, in this case, all we’re asking you to acknowledge in voting “Yes” for the resolution is that the government’s duty to protect us should sometimes, in the right circumstances, require us to give up a small amount of privacy.

    John Donvan:

    Thank you, John Yoo.

    [Applause]

  • 01:25:00

    And finally, making his closing statement against the motion, Michael Chertoff, executive chairman and cofounder of the Chertoff Group.

    Michael Chertoff:

    Well, thanks, everybody. Very stimulating debate and great questions. Look, I know Stewart likes to talk a lot about the Apple phone case. And we’re not going to resolve the engineering question about whether what would have been required would have created a general vulnerability. But that’s not what the resolution is about. The resolution is about whether tech companies, or anybody for that matter, should be required to help do whatever can be done in order to make things accessible to law enforcement. There’s no doubt Congress can pass a law. That’s not the issue. The question is, would that be wise? And if you apply it in this circumstance, what the resolution says is, you should have to configure your platforms in such a way that you can always access information when there is a lawful demand to do so.

  • 01:26:00

    The problem with that is it doesn’t create security for everybody. It creates security in some circumstances. Look at what goes on around our world: the $80 million stolen from Bangladesh Bank, the efforts to influence elections in France, the personal data that is stolen, the 500 million Yahoo! accounts that were hacked by the Russians who got indicted, which means 500 million individuals whose personal information is out there. You realize that if you weaken encryption or limit the ability to protect the data, you are putting the security of the many at risk simply so that the government can in some cases get access to the data. With all the tools the government has that the companies give them, data backed up to the cloud, metadata, location data, sometimes the government will have to do it the hard way. But in the greater good of security for everybody, that may be the right way.

  • 01:27:05

    And that’s what I would say Congress ought to bear in mind when they look at this problem. Thank you very much.

    John Donvan:

    Thank you Michael Chertoff. And that concludes round three of this Intelligence Squared U.S. debate.

    [Applause]

    And now it’s time to learn which side you feel has argued the best. We want to ask you again to go to the key pads at your seat and vote for a second time. Take a look at the motion. Tech companies should be required to help law enforcement execute search warrants to access customer data. Push number one if you agree with the motion, the motion — the side argued by this team. Push two if you disagree with the motion, this team. Push number three if you became or remain undecided.

    We give victory to the team whose numbers have changed the most in percentage points between the first and the second vote. So it’s the difference between the first and the second vote, as opposed to the absolute vote, that determines victory for one team or the other.

  • 01:28:05

    It’ll take about a minute and a half for the results to come in. But while that’s happening, I just want to say a couple of things. As I mentioned in the beginning, the goal of Intelligence Squared U.S. is to raise the level of public discourse and to prove that people with disparate points of view, with real disagreements on principle nevertheless can sit down, exchange ideas, speak to one another civilly, maybe even change each other’s minds. And I just want to say that the spirit in which these four debaters did that, the game they brought to the stage absolutely lived up to our principles. And I want to thank all of you for what you did.

    [Applause]

    I also want to, again, thank the great Jeffrey Rosen for being our partner in this with the National Constitution Center. If you haven’t been to Philadelphia, we’ve done debates there. The center itself is spectacular. But as Jeffrey also pointed out, the center goes far beyond Philadelphia.

  • 01:29:04

    It’s a national organization. It’s getting everywhere. Los Angeles is not that far a hop, and so you heard about the program coming up there. But keep an eye on the NCC. It’s really going places, and it’s great to be partnered with them. I also want to point out this about Intelligence Squared U.S.: we’re a nonprofit organization. We put these debates on, and then we release them for free to the public. As I mentioned, the podcast is out there. The public radio program is out there. We’re in a lot of schools. A lot of schools now actually incorporate us as part of the curriculum, particularly in high schools and the upper grades of elementary school, and we’re very, very proud of that. But I also want to say that we depend enormously on public support to keep that mission going. So, if you like what you saw, if you like what we do, we’d appreciate it if you could give us some support, and there’s a way to do it with your encrypted cell phone right now.

    [Laughter]

  • 01:30:00

    If you text the word “DEBATE” to the following number, you’ll get a link and you can make a contribution. And I know it’s a cliché, but big or small, they all count. We appreciate them all. And so, that number is 797979, whose secret meaning is — absolutely nothing. It’s random. But we would greatly, greatly appreciate that. So, reminding you, the motion is this: Tech Companies Should Be Required To Help Law Enforcement Execute Search Warrants To Access Customer Data. Before the debate, in polling the live audience here in San Francisco, 26 percent agreed with this motion, 47 percent were against the motion, and 27 percent were undecided. Those were the first results. One more time I’ll say this: it’s the difference between the first and the second vote that determines our winner. So, let’s look at the second vote. The team arguing for the motion: their first vote was 26 percent; their second vote, 36 percent. They went up 10 percentage points.

    Stewart Baker:

    Wow!

    John Donvan:

    That’s the number to beat. Let’s see the team against — arguing against the motion.

  • 01:31:00

    Their first vote was 47 percent. Their second vote was 58 percent —

    Stewart Baker:

    Oh, come on!

    John Donvan:

    They went up 11 percentage points. They just —

    [Applause]

    — snuck in. Congratulations to the team arguing against the motion. Our congratulations to them. Thank you from me, John Donvan, and Intelligence Squared U.S. We’ll see you next time.

    [Applause]

    [end of transcript]
